Merging AR 3D content into the real world
Creating a realistic AR experience with ARCore.
The first thing I thought of was finding a way to project shadows onto a transparent surface. The ARCore package comes with a shader that calculates the ambient light and applies it to the object. So I looked for a transparent shader that renders only the shadows instead of the whole surface, and did a test with a primitive plane:
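For reference, here is a minimal sketch of what such a shadow-only shader can look like in Unity's built-in render pipeline. The shader name and exact blend setup are my own choices for illustration, not the specific shader used in this project:

```shaderlab
Shader "Custom/TransparentShadowReceiver"
{
    SubShader
    {
        Tags { "RenderType" = "Transparent" "Queue" = "Transparent" }
        Pass
        {
            Tags { "LightMode" = "ForwardBase" }
            // Multiply the frame buffer by the fragment colour:
            // unchanged where the plane is lit, darkened where shadowed.
            Blend Zero SrcColor
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma multi_compile_fwdbase
            #include "UnityCG.cginc"
            #include "AutoLight.cginc"

            struct v2f
            {
                float4 pos : SV_POSITION;
                LIGHTING_COORDS(0, 1)
            };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                TRANSFER_VERTEX_TO_FRAGMENT(o);
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // LIGHT_ATTENUATION is near 0 in shadow, 1 in full light,
                // so only the shadowed areas darken the camera image.
                fixed atten = LIGHT_ATTENUATION(i);
                return fixed4(atten, atten, atten, 1.0);
            }
            ENDCG
        }
    }
    Fallback "VertexLit" // provides the shadow caster pass
}
```

With multiplicative blending the plane itself is invisible; only the shadow darkens the camera feed behind it.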
So far so good, but when I applied the same material to the TrackedPlaneVisualizer prefab and ran the project on the device, I got this result:

ARCore creates the plane mesh using its internal system, and the ARBackground shader blends the camera capture with the 3D objects in the scene, so I suspect the problem is related to this: the resulting mesh is opaque while the shader I am using is transparent.
I found a project on GitHub that might solve my problem: its author claims you can project a shadow onto a surface using a transparent shader he wrote. I tried it, but unfortunately I got some flickering and no shadows at all:

So what now? Since I am not an expert in shader programming, I decided to stick with the first solution (the one with the transparent ground plane). I created a prefab, placed it in the Andy slot, and got this:
As you can see, the shadow is low resolution: even with the highest shadow resolution set, the result was still poor. So the next steps were to stop rendering the ground plane generated by ARCore and to find a way to get better shadow resolution for tiny objects. After some searching, I found that you can tweak the shadow values in the Quality Settings; in particular, reducing the shadow distance sharpens shadows for small objects. I know it is not recommended to scale down objects because of physics (ARInterface shows that), but for a simple project like this one there is no problem with scaling down:
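The Quality Settings tweaks above can also be applied from a script. A minimal sketch, assuming the built-in render pipeline; the component name is mine and the values are only a starting point to tune per scene:

```csharp
using UnityEngine;

// Hypothetical helper: tightens shadow settings for small AR objects.
public class ShadowQualityTweaks : MonoBehaviour
{
    void Awake()
    {
        // A short shadow distance concentrates the shadow map on
        // nearby geometry, sharpening shadows on tiny objects.
        QualitySettings.shadowDistance = 2f; // metres; tune per scene
        QualitySettings.shadowResolution = ShadowResolution.VeryHigh;
        QualitySettings.shadowCascades = 1;  // no cascades at short range
    }
}
```

Attach it to any object in the scene; the same values can of course be set by hand under Edit > Project Settings > Quality.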

You can check out the whole project on the GitHub page:
Pauleta Milanez
Unity Developer - Programmer