    Mimicking the Human Eye, Researchers Revolutionize Robotic Cameras

By University of Maryland | July 23, 2024
Researchers at the University of Maryland have developed the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), inspired by human eye movements, to enhance robotic vision by reducing motion blur in dynamic environments. (Artist’s concept.) Credit: SciTechDaily.com

    New camera mimics the involuntary movements of the human eye to create sharper, more accurate images for robots, smartphones, and other image-capturing devices.

    Computer scientists have invented a camera mechanism that improves how robots see and react to the world around them. Inspired by how the human eye works, the research team, led by the University of Maryland, developed an innovative camera system that mimics the tiny involuntary movements used by the eye to maintain clear and stable vision over time. The team’s prototyping and testing of the camera—called the Artificial Microsaccade-Enhanced Event Camera (AMI-EV)—was detailed in a paper that was recently published in the journal Science Robotics.

    Advancements in Event Camera Technology

    “Event cameras are a relatively new technology better at tracking moving objects than traditional cameras, but today’s event cameras struggle to capture sharp, blur-free images when there’s a lot of motion involved,” said the paper’s lead author Botao He, a computer science Ph.D. student at UMD. “It’s a big problem because robots and many other technologies—such as self-driving cars—rely on accurate and timely images to react correctly to a changing environment. So, we asked ourselves: How do humans and animals make sure their vision stays focused on a moving object?”
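    To make the comparison concrete, the sketch below shows the kind of asynchronous, per-pixel output an event sensor produces instead of whole frames. It is a simplified Python illustration; the field names, threshold value, and frame-differencing simulation are assumptions made for explanation, not the AMI-EV's actual data format.

    # Illustrative only: approximates an event stream by comparing two intensity
    # frames. A real event sensor reports each pixel's brightness change
    # asynchronously, with microsecond timestamps, rather than via frames.
    from dataclasses import dataclass

    @dataclass
    class Event:
        x: int          # pixel column
        y: int          # pixel row
        t: float        # timestamp (e.g., seconds)
        polarity: int   # +1 brightness increase, -1 brightness decrease

    def to_events(prev_frame, next_frame, t, threshold=0.15):
        """Emit events wherever a pixel changed by more than the threshold."""
        events = []
        for row_idx, row in enumerate(next_frame):
            for col_idx, value in enumerate(row):
                delta = value - prev_frame[row_idx][col_idx]
                if abs(delta) > threshold:
                    events.append(Event(col_idx, row_idx, t, 1 if delta > 0 else -1))
        return events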

    Depiction of novel event camera system versus standard event camera system. Credit: Botao He, Yiannis Aloimonos, Cornelia Fermuller, Jingxi Chen, Chahat Deep Singh

    Mimicking Human Eye Movements

    For He’s team, the answer was microsaccades, small and quick eye movements that involuntarily occur when a person tries to focus their view. Through these minute yet continuous movements, the human eye can keep focus on an object and its visual textures—such as color, depth, and shadowing—accurately over time.

    “We figured that just like how our eyes need those tiny movements to stay focused, a camera could use a similar principle to capture clear and accurate images without motion-caused blurring,” He said.

    Technological Implementation and Testing

    The team successfully replicated microsaccades by inserting a rotating prism inside the AMI-EV to redirect light beams captured by the lens. The continuous rotational movement of the prism simulated the movements naturally occurring within a human eye, allowing the camera to stabilize the textures of a recorded object just as a human would. The team then developed software to compensate for the prism’s movement within the AMI-EV to consolidate stable images from the shifting lights.
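    As a rough illustration of the software side, the sketch below maps each event back to a stabilized image plane by undoing a circular image shift of known radius and rotation rate, the kind of shift a spinning prism would introduce. This is a minimal sketch under assumed parameters, not the compensation algorithm published by the team.

    # Minimal sketch: undo a known circular image shift induced by a rotating prism.
    # radius_px and omega_rad_per_s are assumed, calibrated quantities.
    import math

    def compensate_event(x, y, t, omega_rad_per_s, radius_px, phase=0.0):
        """Map an event at (x, y) and time t back to the prism-free image plane."""
        angle = omega_rad_per_s * t + phase
        dx = radius_px * math.cos(angle)   # horizontal shift at time t
        dy = radius_px * math.sin(angle)   # vertical shift at time t
        return x - dx, y - dy

    # Hypothetical usage: an event at pixel (120, 80), 2.5 ms into the recording,
    # with the prism spinning at 100 Hz and a 4-pixel shift radius.
    stabilized = compensate_event(120, 80, 0.0025, 2 * math.pi * 100, 4.0)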

    A demonstration of how microsaccades counteract visual fading. After a few seconds of fixation (staring) on the red spot in this static image, the background details of this image begin to visually fade. This is because microsaccades have been suppressed during this time and the eye cannot provide effective visual stimulation to prevent peripheral fading. Credit: UMIACS Computer Vision Laboratory

    Study co-author Yiannis Aloimonos, a professor of computer science at UMD, views the team’s invention as a big step forward in the realm of robotic vision.

    “Our eyes take pictures of the world around us and those pictures are sent to our brain, where the images are analyzed. Perception happens through that process and that’s how we understand the world,” explained Aloimonos, who is also director of the Computer Vision Laboratory at the University of Maryland Institute for Advanced Computer Studies (UMIACS). “When you’re working with robots, replace the eyes with a camera and the brain with a computer. Better cameras mean better perception and reactions for robots.”

    Potential Impact on Various Industries

    The researchers also believe that their innovation could have significant implications beyond robotics and national defense. Scientists working in industries that rely on accurate image capture and shape detection are constantly looking for ways to improve their cameras—and AMI-EV could be the key solution to many of the problems they face.

    “With their unique features, event sensors and AMI-EV are poised to take center stage in the realm of smart wearables,” said research scientist Cornelia Fermüller, senior author of the paper. “They have distinct advantages over classical cameras—such as superior performance in extreme lighting conditions, low latency, and low power consumption. These features are ideal for virtual reality applications, for example, where a seamless experience and the rapid computations of head and body movements are necessary.”


    Enhancements in Real-Time Image Processing

    In early testing, AMI-EV was able to capture and display movement accurately in a variety of contexts, including human pulse detection and rapidly moving shape identification. The researchers also found that AMI-EV could capture motion at tens of thousands of frames per second, outperforming most commercially available cameras, which typically capture 30 to 1,000 frames per second. This smoother and more realistic depiction of motion could prove pivotal in applications ranging from more immersive augmented reality experiences and better security monitoring to improving how astronomers capture images in space.
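    For a back-of-the-envelope sense of what those rates mean, the short snippet below converts each frame rate into the time between frames; the 20,000 fps figure is only an illustrative stand-in for "tens of thousands."

    # Compare the frame intervals implied by the quoted capture rates.
    rates = [
        ("typical consumer camera", 30),
        ("high-end commercial camera", 1_000),
        ("AMI-EV (illustrative)", 20_000),
    ]
    for label, fps in rates:
        print(f"{label:>28}: {1e3 / fps:8.3f} ms between frames")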

    Conclusion and Future Outlook

    “Our novel camera system can solve many specific problems, like helping a self-driving car figure out what on the road is a human and what isn’t,” Aloimonos said. “As a result, it has many applications that much of the general public already interacts with, like autonomous driving systems or even smartphone cameras. We believe that our novel camera system is paving the way for more advanced and capable systems to come.”

    Reference: “Microsaccade-inspired event camera for robotics” by Botao He, Ze Wang, Yuan Zhou, Jingxi Chen, Chahat Deep Singh, Haojia Li, Yuman Gao, Shaojie Shen, Kaiwei Wang, Yanjun Cao, Chao Xu, Yiannis Aloimonos, Fei Gao and Cornelia Fermüller, 29 May 2024, Science Robotics.
    DOI: 10.1126/scirobotics.adj8124

    In addition to He, Aloimonos, and Fermüller, other UMD co-authors include Jingxi Chen (B.S. ’20, computer science; M.S. ’22, computer science) and Chahat Deep Singh (M.E. ’18, robotics; Ph.D. ’23, computer science).

    This research is supported by the U.S. National Science Foundation (Award No. 2020624) and National Natural Science Foundation of China (Grant Nos. 62322314 and 62088101). This article does not necessarily reflect the views of these organizations.
