#robotics


Dragonflies can see by switching “on” and “off”

Researchers at the University of Adelaide have discovered a novel and complex visual circuit in a dragonfly’s brain that could one day help to improve vision systems for robots.

Dr Steven Wiederman and Associate Professor David O'Carroll from the University’s Centre for Neuroscience Research have been studying the underlying processes of insect vision and applying that knowledge in robotics and artificial vision systems.

Their latest discovery, published this month in The Journal of Neuroscience, is that the brains of dragonflies combine opposite pathways - both an ON and OFF switch - when processing information about simple dark objects.

“To perceive the edges of objects and changes in light or darkness, the brains of many animals, including insects, frogs, and even humans, use two independent pathways, known as ON and OFF channels,” says lead author Dr Steven Wiederman.

“Most animals will use a combination of ON switches with other ON switches in the brain, or OFF and OFF, depending on the circumstances. But what we show occurring in the dragonfly’s brain is the combination of both OFF and ON switches. This happens in response to simple dark objects, likely to represent potential prey to this aerial predator.

"Although we’ve found this new visual circuit in the dragonfly, it’s possible that many other animals could also have this circuit for perceiving various objects,” Dr Wiederman says.

The researchers were able to record their results directly from ‘target-selective’ neurons in dragonflies’ brains. They presented the dragonflies with moving lights that changed in intensity, as well as both light and dark targets.

“We discovered that the responses to the dark targets were much greater than we expected, and that the dragonfly’s ability to respond to a dark moving target is from the correlation of opposite contrast pathways: OFF with ON,” Dr Wiederman says.
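As a rough illustration of the mechanism Dr Wiederman describes (a toy sketch, not the published model), a dark target passing a photoreceptor produces an OFF event (darkening) followed by an ON event (brightening as it leaves), so correlating the OFF channel with a delayed ON channel responds selectively to small dark moving objects:

```python
import numpy as np

def on_off_target_response(luminance, delay=3):
    """Toy ON/OFF correlation detector for a small dark moving target.

    A dark target passing a photoreceptor first darkens it (an OFF
    event) and then brightens it again as it leaves (an ON event).
    Correlating OFF with a delayed ON therefore responds to dark
    targets but stays silent for bright ones, where ON comes first.
    """
    dL = np.diff(luminance)           # frame-to-frame contrast change
    on = np.maximum(dL, 0.0)          # ON channel: brightening only
    off = np.maximum(-dL, 0.0)        # OFF channel: darkening only
    return off[:-delay] * on[delay:]  # OFF correlated with delayed ON

# A dark target crossing the receptor: a brief dip in luminance.
signal = np.ones(20)
signal[8:11] = 0.2
print(on_off_target_response(signal).max())  # positive for a dark target
```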

“The exact mechanisms that occur in the brain for this to happen are of great interest in visual neurosciences generally, as well as for solving engineering applications in target detection and tracking. Understanding how visual systems work can have a range of outcomes, such as in the development of neural prosthetics and improvements in robot vision.

"A project is now underway at the University of Adelaide to translate much of the research we’ve conducted into a robot, to see if it can emulate the dragonfly’s vision and movement. This project is well underway and once complete, watching our autonomous dragonfly robot will be very exciting,” he says.


Post link

KILL IT WITH FIRE. Also, preserve and celebrate it for its remarkably superior form.

Read More: Humans Are Getting Scarily Good At Creating Bipedal Beasts


Post link

Ren has seen TRUNDL.corp at its worst. Can she help make it its best?

See how she got here, here.


Post link

Tune in next week for at least two (2) thrills and up to three (3) chills!


Post link

NOW IN DEVELOPMENT: TRUNDL.buddy and the Ghostly Wi-Filactery is a fast-paced tactical roadtrip game developed by SPITE HOUSE.

TRUNDL.corp’s newly-conscious flagship product awakens with a head full of voices and a singular impulse: DRIVE.

Swerve through countless (five) environments toward one of many (three) endings. Equip your fat little body with armor and weapons to face hostile bots sent to flatten your chassis. Collect glimpses of an inevitable future haunted by sound and color and also ghosts. TRUNDL your buddy hard enough and you may answer the voices’ call. But what will you say?

Grow a mind. Swerve the world. But mostly… NEVER. EVER. STOP.


Post link

Implantable microrobots: Innovative manufacturing platform makes intricate biocompatible micromachines

A team of researchers led by Biomedical Engineering Professor Sam Sia at Columbia Engineering has developed a way to manufacture microscale machines from biomaterials that can safely be implanted in the body. Working with hydrogels, which are biocompatible materials that engineers have been studying for decades, Sia has invented a new technique that stacks the soft material in layers to make devices with three-dimensional, freely moving parts. The study, published online January 4, 2017, in Science Robotics, demonstrates a fast manufacturing method Sia calls “implantable microelectromechanical systems” (iMEMS).

By exploiting the unique mechanical properties of hydrogels, the researchers developed a “locking mechanism” for precise actuation and movement of freely moving parts, which can provide functions such as valves, manifolds, rotors, pumps, and drug delivery. They were able to tune the biomaterials within a wide range of mechanical and diffusive properties and to control them after implantation without a sustained power supply such as a toxic battery. They then tested “payload” delivery in a bone cancer model and found that triggered release of doxorubicin from the device over 10 days showed high treatment efficacy and low toxicity, at one tenth of the standard systemic chemotherapy dose.

Read more.


Post link

materialsworld:

Week in Brief (28 August – 1 September)

Credit: Courtesy of the researchers/MIT

Although we may be unaware of it (or choose to rebel against it), we tend to observe codes of pedestrian conduct – keeping to the right, passing on the left, respecting personal space and not barging straight through people. Now, engineers at MIT have designed an autonomous robot that can do the same.

In tests, the knee-high robot, which runs on wheels, managed to avoid colliding with pedestrians while keeping up with their pace. To allow the robot to plan its movement, the team fitted it with sensors, including a webcam, a depth sensor and a lidar sensor. The robot was trained using reinforcement learning in computer simulations to identify the optimum path through a crowd.
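The article doesn’t spell out the training objective, but reward functions for socially aware navigation typically combine progress toward the goal with penalties for collisions and personal-space intrusions. A hypothetical sketch (weights and distances are illustrative, not from the MIT paper):

```python
import math

def social_nav_reward(robot_pos, goal, pedestrians, prev_goal_dist,
                      personal_space=0.5, collision_radius=0.2):
    """Illustrative reward for socially aware navigation: reward
    progress toward the goal, penalise collisions and intrusions
    into pedestrians' personal space (all distances in metres)."""
    goal_dist = math.dist(robot_pos, goal)
    reward = prev_goal_dist - goal_dist        # progress term
    for p in pedestrians:
        d = math.dist(robot_pos, p)
        if d < collision_radius:               # hard collision: episode penalty
            return -1.0
        if d < personal_space:                 # soft personal-space penalty
            reward -= 0.1 * (personal_space - d)
    return reward

# One step: robot at the origin, goal 5 m ahead, a pedestrian 0.4 m away.
print(social_nav_reward((0, 0), (5, 0), [(0.4, 0)], prev_goal_dist=5.2))
```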

Yu Fan “Steven” Chen, lead author of the study, commented, ‘Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians […] For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals.’

Details of the robot will be outlined in a paper presented at the IEEE Conference on Intelligent Robots and Systems in September.

To find out more, visit bit.ly/2gpdt6x

In other news:

-A proposal to make a Roman gold-mining area in Romania a UNESCO world heritage site may be halted so that mining can resume

-The Brazilian President’s attempt to revise a decree preventing mining in the Amazon has been blocked by the courts

-A Dutch nuclear research institute has begun experiments on next-generation nuclear reactors powered by thorium

To find out more on materials science, packaging and engineering news, visit the IOM3 website or follow us on Twitter @MaterialsWorld for regular news updates.

technology-org:


At the recent International Conference on Robotics and Automation, MIT researchers presented a printable origami robot that folds itself up from a flat sheet of plastic when heated and measures about a centimeter from front to back. The MIT researchers’ centimeter-long origami…

Read more

materialsworld:

In a major breakthrough for cancer research, a team from Polytechnique Montréal, Université de Montréal and McGill University, Canada, has developed nanorobotic agents able to administer drugs with precision to targeted cancerous cells in tumours. Injecting medication in this way ensures optimal targeting and avoids jeopardising organs and the surrounding tissue.

The nanorobotic agents are made up of more than 100 million flagellated bacteria, giving them the ability to self-propel. The agents are loaded with drugs and move along the most direct path between the injection point and the area of the body to be treated. The propelling force of the agents is enough to travel to and enter the tumours.

Once inside a tumour, the agents can detect oxygen-depleted areas, known as hypoxic zones, and deliver the drug to them. Hypoxic zones are resistant to most therapies, including radiotherapy.

The bacteria that make up the agents rely on two natural systems to move around. The synthesis of a chain of magnetic nanoparticles acts as a compass, allowing the bacteria to move in the direction of a magnetic field, while a sensor measuring oxygen concentration enables them to reach active tumour regions. When exposed to a computer-controlled magnetic field, the bacteria showed the researchers that they could replicate the behaviour of artificial nanorobots.
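The two guidance systems lend themselves to a simple simulation. Below is a toy sketch (names and thresholds are illustrative assumptions, not from the study): the agent swims along the applied magnetic field and stops when the local oxygen level falls below a hypoxic threshold:

```python
import numpy as np

def steer_agent(start, field_direction, oxygen_at, steps=200,
                speed=0.1, hypoxic_threshold=0.2):
    """Toy magneto-aerotactic agent: swim along the magnetic field
    (magnetotaxis) until local oxygen drops below the hypoxic level
    (aerotaxis), then stop and deliver the payload."""
    direction = np.asarray(field_direction, float)
    direction /= np.linalg.norm(direction)       # follow the applied field
    pos = np.asarray(start, float)
    for _ in range(steps):
        if oxygen_at(pos) < hypoxic_threshold:   # hypoxic zone reached
            break
        pos = pos + speed * direction
    return pos

# Oxygen falls toward a "tumour" centred at (5, 0); agent released at origin.
oxygen = lambda p: min(1.0, float(np.linalg.norm(p - np.array([5.0, 0.0]))) / 5.0)
print(steer_agent([0.0, 0.0], [1.0, 0.0], oxygen))  # stops near the hypoxic zone
```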

The researchers say the nanorobots open the door to the synthesis of new vehicles for therapeutic, imaging and diagnostic agents, and could be used in chemotherapy to eliminate harmful side effects by targeting only the affected area.


This AI robot will strengthen your ping-pong skills 

With all the recent talk of AI posing existential risks to humanity and our privacy, Japanese company Omron is taking a softer, more innocuous approach. Specifically, with its table tennis robot Forpheus, which strives to pursue “harmony of humans and machines” by patiently teaching us how to play ping-pong.

Although ping-pong ball-pitching machines like TrainerBot exist, Forpheus comes much closer to the feeling of playing against a real opponent.

It uses a robotic arm, controlled by the AI through a 5-axis motor system, to swing the paddle. The motion controller, or the “brain,” tells the machine how to hit the ball, advising it on timing and direction within a thousandth of a second. Read More
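Omron hasn’t published Forpheus’s control code, but that millisecond-scale “timing and direction” advice boils down to a prediction problem: estimate where and when the ball will cross the paddle’s plane. A simplified ballistic sketch (ignoring spin and air drag, both of which matter in real table tennis):

```python
GRAVITY = 9.81  # m/s^2

def predict_intercept(pos, vel, paddle_x=0.0):
    """Predict when and where a ball crosses the paddle plane x = paddle_x,
    assuming purely ballistic flight. pos in metres, vel in m/s.
    Returns (time, y, z) or None if the ball is moving away."""
    x, y, z = pos
    vx, vy, vz = vel
    if vx == 0 or (paddle_x - x) / vx < 0:
        return None                             # never reaches the plane
    t = (paddle_x - x) / vx
    return (t,
            y + vy * t,                         # lateral position at impact
            z + vz * t - 0.5 * GRAVITY * t**2)  # height, falling under gravity

# Ball 2 m away, closing at 8 m/s: intercept in 0.25 s.
print(predict_intercept((2.0, 0.1, 0.3), (-8.0, 0.4, 1.0)))
```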

Post link

MIT CSAIL’s ‘LaserFactory’ automates the full process for making functional devices in one system.

Though 3D printers have the capacity to build even life-saving components in a relatively short period of time, they lack the ability to fabricate more complex devices such as functional drones and robots that work straight out of the printer.

A group of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a new platform with an eye-catching automated process: it prints functional, custom-made robotic devices without any human intervention.

The system, called “LaserFactory”, is composed of two parts: a software toolkit for users to design custom devices, and a hardware platform that automates the fabrication process.

In a press release, CSAIL Ph.D. student Martin Nisser describes the platform as a “one-stop-shop” that could be greatly beneficial to roboticists, educators, and product developers looking to quickly build a prototype.

“Making fabrication inexpensive, fast, and accessible to a layman remains a challenge,” explains Nisser. “By leveraging widely available manufacturing platforms like 3D printers and laser cutters, LaserFactory is the first system that integrates these capabilities and automates the full pipeline for making functional devices in one system.”

Democratizing the robot-building process

The researchers plan to reveal more about their project at an event in May, though the following description from the team’s press statement gives a good idea of the platform’s workings:

“Let’s say a user has aspirations to create their own drone. They’d first design their device by placing components on it from a parts library, and then draw on circuit traces, which are the copper or aluminum lines on a printed circuit board that allow electricity to flow between electronic components.

"They’d then finalize the drone’s geometry in the 2D editor. In this case, they’d use propellers and batteries on the canvas, wire them up to make electrical connections, and draw the perimeter to define the quadcopter’s shape.”

A video further lays out the process, which sees users view a preview of their design before the proprietary software translates their custom blueprint into single-file machine instructions that allow the device to be built in one go.
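The file format hasn’t been released, so the sketch below only illustrates the general shape the description implies: a design of outline, traces, and components compiled into one ordered instruction stream the hardware can execute in a single run. Every name here is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Design:
    """Hypothetical device design: a shell outline to cut, conductive
    traces to dispense, and components to place."""
    outline: list      # (x, y) vertices of the shell perimeter
    traces: list       # ((x1, y1), (x2, y2)) conductive paths
    components: list   # (name, (x, y)) placements

def compile_instructions(design):
    """Flatten a design into single-file machine instructions,
    mirroring the build-in-one-go pipeline described above."""
    program = [("CUT", design.outline)]               # 1. cut the geometry
    for start, end in design.traces:                  # 2. dispense silver traces
        program.append(("DISPENSE_TRACE", start, end))
    program.append(("CURE_TRACES",))                  # 3. cure to make conductive
    for name, pos in design.components:               # 4. place components
        program.append(("PLACE", name, pos))
    return program

quad = Design(outline=[(0, 0), (100, 0), (100, 100), (0, 100)],
              traces=[((10, 10), (90, 10))],
              components=[("motor", (10, 10)), ("battery", (50, 50))])
for step in compile_instructions(quad):
    print(step)
```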

In terms of hardware, the laser cutter features an add-on that prints circuit traces and assembles components — it even allows the machine to dispense and cure silver to make circuit traces conductive.

Impressively, the method allows for a drone to fly right out of the sheet of material that was cut to make its shell.

Credit: Video by Alexandre de Bellefeuille - Photos by Malina Corpadean

By Shardell Joseph 

Robotic clothing made from silicone, glass and polyvinylidene fluoride has been fitted with electronic devices that make each piece of the garment react to the colour spectrum of its surroundings.


The Montreal-based fashion designer Ying Gao titled the collection ‘flowing water, standing time’, with the aim of capturing ‘the essence of movement and stability over a period of time’. Gao also wanted to focus on the flow of ‘energies’ through a garment, mirroring the colours in its immediate surroundings.


This project was inspired by neurologist Oliver Sacks’ book The Man Who Mistook His Wife for a Hat, in which he relates the story of Jimmie G, a 49-year-old former sailor convinced he is still 19, the age at which he left the Navy.
Shocked by his own reflection when Sacks hands him a mirror, Jimmie reverts to his 19-year-old self as soon as his gaze leaves the reflective surface. Having lost any sense of temporal continuity, Jimmie lives as a prisoner of this single, perpetual moment, oscillating between a presence to the world and a presence to self.


Reflecting the character’s journey through the book, the garments perpetually change between two states as they react to the chromatic spectrum. ‘In order to echo this varying mobility, the garments are capable of chromatic movement,’ said Gao. ‘Capable of recognising the colours in their immediate surroundings, they are at once liquid and chameleon-like, adapting to the slow rhythm of their ever-changing environment.


‘A mirror effect is at play – the garments are reacting to what they see. Much like Oliver Sacks’ patient, they alternate between what they are, and what they can potentially become – all the while embodying the inherent complexity of all things.’


Theseus. OC. You can read about him.


Post link

CEO personal security robot


Post link

Newton was developed by Synpet in 1989 with the goal of helping people in schools, homes and offices. Newton could walk (locomotion), talk (TTS and voice recognition), play games, handle calls (cordless phone), detect smoke and call emergency services, and play music. It also featured an internet modem.

Here is a promo video that they distributed (it’s wonderfully ’80s; make sure to check out the theme song near the end of the video).

Did Newton actually accomplish all of this? Probably not, given the technology (and the costs…), but it’s possible that a small subset of these features worked in closed environments!


Post link
#robotics

Light Painting Machine

Project from Josh Sheldon showcases his light-painting animation set-up, translating scenes created in the 3D software Blender into paths replicated by a robot-controlled camera and LED:

I made this robot to make light painting animations. Each of the animations I made took between 4 and 12 hours to shoot, one frame at a time. Each frame is 1-3 long exposure photographs of the machine performing the light painting. 

More Here
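The arithmetic behind those shoot times is easy to sketch. Assuming, hypothetically, 30-second exposures and per-frame light paths sampled from the Blender scene, the shot list grows as frames × exposures:

```python
def frames_to_exposures(paths, exposures_per_frame=2, exposure_s=30):
    """Turn per-frame 3D light paths (e.g. sampled from a Blender
    curve) into a shot list: one long exposure per pass over each
    frame's path."""
    shots = []
    for frame, path in enumerate(paths):
        for take in range(exposures_per_frame):
            shots.append({"frame": frame, "take": take,
                          "exposure_s": exposure_s, "path": path})
    return shots

# 240 frames at 2 exposures of 30 s each is about 4 hours of shooting,
# consistent with the 4-12 hour range quoted above.
paths = [[(0, 0, 0), (1, 0, 0)]] * 240
shots = frames_to_exposures(paths)
print(len(shots), "exposures,", len(shots) * 30 / 3600, "hours")
```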


Post link

IMAGINATIVE

Telepresence art installation by H.O features two rooms: one with a display and an eye-tracking interface to control a ceramic cup, the other with a chroma-key-suited robotic arm that carries out instructions from the first room:

What if our thoughts materialized as tangible actions in remote locations?
IMAGINATIVE is an installation work consisting of two rooms.
In the first, there is a monitor displaying a table with a cup in an empty room. Using a specialized (eye tracking / brain computer) interface, participants can telekinetically control the cup in the remote space. The shifting of the cup back and forth, poised in the air, expresses someone’s imagination in real-time.
In the second room, the audience will encounter an extraordinary spectacle. Sitting on the table is a green robot arm enabling the cup to move, while the floor surrounding the table is covered in broken cups.
Given that brain computer interfaces directly connect our “consciousness” with the external world, what does “imagination” mean? What is the “imagination” that we present to the world? 

More Here
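H.O haven’t described the control pipeline, but an eye-tracking interface of this kind reduces to mapping a gaze point on the monitor into a target pose for the hidden arm. A minimal sketch, with the screen size and workspace bounds as illustrative assumptions:

```python
def gaze_to_cup_target(gaze_px, screen_px=(1920, 1080),
                       workspace=((-0.3, 0.3), (0.0, 0.4))):
    """Map a gaze point in screen pixels to a cup position in metres
    within the remote arm's workspace: x across the table, z above it."""
    (x_min, x_max), (z_min, z_max) = workspace
    u = gaze_px[0] / screen_px[0]        # 0..1 left-to-right
    v = 1.0 - gaze_px[1] / screen_px[1]  # 0..1, flipped so up is up
    return (x_min + u * (x_max - x_min),
            z_min + v * (z_max - z_min))

# Gazing at the upper-middle of the screen lifts the cup into the air.
print(gaze_to_cup_target((960, 270)))  # ≈ (0.0, 0.3)
```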


Post link

robotcore moodboard with bright colors + early 2000s tech for @omegasmileyface
