The Eye Setup page is designed to be your go-to page when setting up all the technical elements for your installation (everything other than your experiential content). You can get to it by going to your installations tab and clicking "Edit Eye Setup" on the installation you'd like to edit.
Using the Eye Setup page, you can
Clean the depth filtered image received from the eye (this tutorial)
Set the camera and content window details (go to this tutorial)
Cleaning Depth Filtered
In your Eye Setup page, you have the basic settings that you need to set for every installation, and you have advanced options if you'd like to calibrate it further.
The basic settings are:
How deep should it look?
What depth should it pay attention to?
The advanced options are:
What size of objects should it respond to?
Reduce and Clean
All of these settings apply specifically to the transformation from Depth Original to Depth Filtered, where Depth Filtered is what is used to generate the dynamic content in your experiences.
When setting up your eye, the main element you need to set is what depth area you would like to monitor. You can turn any depth range interactive and even learn to ignore existing furniture.
How deep should it look? / What depth should it pay attention to?
When setting the depth range, you can set any range you would like (for example, only 5ft to 7ft away from the camera, or people between 2ft and 14ft away). This requires two steps:
Setting "How deep should it look?"
Setting "How far" and "How near" in "What depth should it pay attention to?"
"How deep should it look?" tells the depth camera how far to look. Unless you need precise control, you can simply set it to the maximum value. Note that this setting does not work in "Simulation" mode. You'll see it reflected in the depth original on your control window:
"What depth should it pay attention to?" lets you set exactly what distance from the depth camera it will monitor. Since Zuzor supports various depth cameras, and exact depth values vary with ambient lighting for some cameras, it is difficult to set the range from the sliders alone. A best practice is to have someone (or something) stand at the far limit of the area you would like to monitor, reduce the "How far?" slider until they are out of view, and then slowly increase it until they are back in full view. Similarly, to set the near limit, have someone stand at the closest point you would like to monitor, increase the "How near?" slider until they are out of view, and then slowly reduce it until they are back in full view.

Often, you can leave the sliders at the maximum range ("How near?" set to 0 and "How far?" set to 255) and simply click "Learn Background" if furniture, the ceiling, or the floor comes into view (or you can crop them out - scroll down to "Crop Right/Left/Top/Bottom" to learn how). If you're installing in an area where large groups walk through, you will likely want to limit the monitored area to only the area close to the screen.
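Conceptually, the "How near?" and "How far?" sliders define a band of depth values to keep. The sketch below illustrates that idea in numpy; it is not Zuzor's actual code, and the assumption that depth values map to the sliders' 0-255 range is taken from the description above.

```python
import numpy as np

def depth_filter(depth, how_near=0, how_far=255):
    """Keep only pixels whose depth value falls inside [how_near, how_far].

    depth: a 2-D array of 8-bit depth values, matching the 0-255 slider
    range described above. Pixels outside the band are zeroed out, so
    only objects in the monitored range remain in the filtered image.
    """
    mask = (depth >= how_near) & (depth <= how_far)
    return np.where(mask, depth, 0)

# A toy 1x5 "depth image": a person at depth 120, another at 60,
# a far wall at 240, plus pixels at the extremes.
frame = np.array([[0, 120, 240, 60, 255]], dtype=np.uint8)

# Monitor only the band from 50 to 200 - the wall and the
# extreme-depth pixels are filtered out, the people remain.
filtered = depth_filter(frame, how_near=50, how_far=200)
```

Tuning the sliders in the app is the visual equivalent of adjusting `how_near` and `how_far` here until only the people you care about survive the filter.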
If there is furniture or there are fixtures within your desired depth range, they will automatically be detected and will affect the dynamic content. To remove them, click "Learn Background" and Zuzor will spend the next 15 to 30 seconds learning the environment so it can automatically remove all background elements it finds. Note that while it is learning, everyone must be out of the field of view; otherwise it will leave a trace of them in the depth filtered screen. If you launch the "Learn Background" process from the web interface, allow a minute for your Zuzor app to receive the command and start learning. There is no limit to the number of times you can "Learn Background," and you do not need to re-learn every time you start the app: it automatically applies your previously learned environment. You do need to re-learn if you change the "How deep should it look?" parameter, reposition the depth camera, or move furniture around.
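The idea behind background learning can be sketched as: observe the empty scene for a while, remember each pixel's depth, then ignore any live pixel that still matches that memory. This is a minimal conceptual sketch (averaging plus a tolerance threshold are assumptions for illustration, not Zuzor's algorithm):

```python
import numpy as np

def learn_background(frames):
    """Average several empty-scene depth frames into a background model.
    This mirrors the idea of "Learn Background": watch the scene with
    nobody in view and remember what depth each pixel normally has."""
    return np.mean([f.astype(float) for f in frames], axis=0)

def remove_background(depth, background, tolerance=5):
    """Zero out any pixel whose depth is within `tolerance` of the
    learned background, leaving only new objects (e.g. people)."""
    changed = np.abs(depth.astype(float) - background) > tolerance
    return np.where(changed, depth, 0)

# Two "empty room" frames with a fixed shelf at the left pixel.
empty = [np.array([[100, 100, 0]], dtype=np.uint8),
         np.array([[100, 102, 0]], dtype=np.uint8)]
bg = learn_background(empty)

# A person (depth 150) appears in the middle; the shelf stays put.
live = np.array([[100, 150, 0]], dtype=np.uint8)
cleaned = remove_background(live, bg)  # shelf removed, person kept
```

This also shows why everyone must stay out of view while learning: a person present during `learn_background` would be baked into the model and subtracted from every later frame, leaving the "trace" mentioned above.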
When installing a depth camera with your installation, a range of variables affects the quality of the depth image you receive: the type of depth camera in use, the amount and type of ambient lighting, the distance and relative position of the camera to the area being monitored, and more. To account for this range of possible scenarios, you have a variety of advanced options to calibrate to your desired image.
Why should you care? In most cases, you won't, and you can leave it as is. In some scenarios - a particularly noisy environment, or changing lighting conditions that affect the perceived depth of objects over time (such as an Orbbec, a sunlight-sensitive depth camera, watching a space with direct sunlight reflecting off an object) - you may want to clean it further.
It's common for the field of view of the camera eye to be greater than the area you'd like to activate. (If, instead, the field of view is too small, consider moving the depth camera further away or changing the layout, such as positioning the depth camera across from the display rather than below it.) To exclude areas of the field of view that you do not want to monitor, and to save computing power by reducing unnecessary dynamic content, you can crop the depth filtered image from the top, bottom, left, and right. The cropping is designed such that:
Crop Left: crops from the very left to the value of Crop Left - defaults to 0 where it does not apply any cropping
Crop Right: crops from the very right to the value of Crop Right - defaults to 640 where it does not apply any cropping
Crop Top: crops from the very top to the value of Crop Top - defaults to 480 where it does not apply any cropping
Crop Bottom: crops from the very bottom to the value of Crop Bottom - defaults to 0 where it does not apply any cropping
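The four crop values and their no-crop defaults (0, 640, 480, 0) suggest a 640x480 depth image with the vertical values counted from the bottom. Under that assumption, the crops can be sketched as follows; this is a conceptual illustration, not Zuzor's implementation, and the toy image is scaled down from 640x480:

```python
import numpy as np

def crop_depth(depth, crop_left=0, crop_right=640,
               crop_top=480, crop_bottom=0):
    """Blank everything outside the kept region.

    Defaults match the no-crop values above for a 640x480 image:
    - Crop Left blanks columns left of `crop_left`
    - Crop Right blanks columns right of `crop_right`
    - Crop Top blanks rows above `crop_top` (measured from the bottom)
    - Crop Bottom blanks rows below `crop_bottom`
    """
    h, w = depth.shape
    out = depth.copy()
    out[:, :crop_left] = 0          # crop from the very left
    out[:, crop_right:] = 0         # crop from the very right
    out[:h - crop_top, :] = 0       # crop from the very top
    if crop_bottom:
        out[h - crop_bottom:, :] = 0  # crop from the very bottom
    return out

# A toy 4x4 fully-active frame; crop one column off each side
# and one row off the bottom (crop_top=4 == height, so no top crop).
frame = np.ones((4, 4), dtype=np.uint8)
out = crop_depth(frame, crop_left=1, crop_right=3,
                 crop_top=4, crop_bottom=1)
```

Cropped pixels are simply zeroed out of the depth filtered image, which is why cropping also saves computing power: nothing in those regions can trigger dynamic content.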
What size of objects should it respond to?
In environments where there are small objects or noise in the area you're trying to activate, you can ignore those objects manually. The easiest way to ignore furniture and other elements in the environment is "Learn Background" (as covered above), but you can also set what range of object sizes to monitor. To set the object size range manually, use the "What size of objects should it respond to?" sliders.
For example, you can see here that "What depth should it pay attention to?" is set so that the fixtures in the back fall outside the monitored range. (A limitation is that if the person steps back, they will no longer be monitored.)
One way to address this is to increase the "How far?" slider in "What depth should it pay attention to?" but then we capture some background fixtures:
You can manually ignore those fixtures by setting the "min size" in "What size of objects should it respond to?" to a higher number until you see those fixtures disappear.
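A minimum object size works by grouping nonzero pixels into connected blobs and discarding blobs below the threshold. The sketch below illustrates that idea with a simple flood fill; the 4-connectivity and pixel-count measure are assumptions for illustration, not how Zuzor necessarily defines object size:

```python
import numpy as np
from collections import deque

def filter_small_objects(mask, min_size):
    """Remove connected blobs smaller than `min_size` pixels.

    Labels each 4-connected blob of nonzero pixels, then keeps only
    blobs of at least `min_size` pixels - small fixtures and stray
    noise disappear while people remain."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    out = np.zeros_like(mask)
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # Flood-fill one blob, collecting its pixels.
                blob, queue = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x),
                                   (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(blob) >= min_size:
                    for y, x in blob:
                        out[y, x] = mask[y, x]
    return out

# A person (6-pixel blob, left) next to a small fixture (2 pixels, right).
scene = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
], dtype=np.uint8)
cleaned = filter_small_objects(scene, min_size=3)  # fixture removed
```

Raising the "min size" value in the app corresponds to raising `min_size` here: small blobs drop out first, and you stop once the unwanted fixtures disappear.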
Reduce and Clean / Enlarge / Blur:
In scenarios where you need to remove small noise from the depth feed, you will often use "Reduce and Clean" to remove that noise. Because "Reduce and Clean" may shrink the silhouettes of people seen through depth filtered, you will often compensate with a combination of "Enlarge" and "Blur."
Reduce and Clean
True to its name, "Reduce and Clean" reduces the outline of all objects perceived. At times, this can cause elements to shrink more than you intend, or disappear entirely. You can see the difference here:
"Enlarge" enlarges the outline of all objects in the area monitored. If people are particularly far from the depth camera, you may want to use this to make them more visible on the screen. Often, you may use this to counteract the effect of "Reduce and Clean."
"Blur" blurs the outline of elements in depth filtered. This is often helpful for smoothing out the silhouettes of people and making the dynamic content cleaner in general.
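In classical image processing, "reduce" and "enlarge" operations correspond to erosion and dilation, and the sketch below illustrates the interplay described above: erosion deletes lone noise pixels but shrinks people; dilation grows them back. This is a conceptual numpy illustration, not Zuzor's code, and "Blur" would similarly be a local averaging of the same mask:

```python
import numpy as np

def shift_stack(mask):
    """Stack a binary mask with its four 4-neighbour shifts (edges
    padded with zeros), so erosion and dilation become a min or max
    taken across the stack."""
    padded = np.pad(mask, 1)
    return np.stack([padded[1:-1, 1:-1],   # centre
                     padded[:-2, 1:-1],    # shifted down (up-neighbour)
                     padded[2:, 1:-1],     # shifted up (down-neighbour)
                     padded[1:-1, :-2],    # shifted right (left-neighbour)
                     padded[1:-1, 2:]])    # shifted left (right-neighbour)

def reduce_and_clean(mask):
    """Erode: a pixel survives only if it and all four neighbours are
    on. Shrinks outlines and deletes isolated noise pixels entirely."""
    return shift_stack(mask).min(axis=0)

def enlarge(mask):
    """Dilate: a pixel turns on if it or any neighbour is on.
    Grows outlines back after erosion has shrunk them."""
    return shift_stack(mask).max(axis=0)

noisy = np.array([
    [0, 0, 0, 0, 0, 1],   # lone noise pixel, top right
    [0, 1, 1, 1, 0, 0],   # a 3x3 "person" blob
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
], dtype=np.uint8)

cleaned = reduce_and_clean(noisy)  # noise gone, blob shrunk to its centre
restored = enlarge(cleaned)        # blob grown back, noise still gone
```

This is why the two settings are used together: erosion alone can shrink silhouettes "past what you intend," while a following dilation restores their size without bringing the noise back.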
We're here to help
Now you have a basic idea of how to set up your eye for any installation! Be fearless and experiment to get comfortable with it. We're always here to help: send us a message through the chat icon on the right, email us, or call us for any assistance. If you'd like to make sure we're reachable during a specific time you're setting up, email us and we'll make sure to accommodate your needs.