Week in Brief (28 August – 1 September)
Credit: Courtesy of the researchers/MIT
Although we may be unaware of it (or choose to rebel against it), we tend to observe codes of pedestrian conduct – keeping to the right, passing on the left, respecting personal space and not barging straight through other people. Now, engineers at MIT have designed an autonomous robot that can do the same.
In tests, the knee-high, wheeled robot avoided colliding with pedestrians while keeping up with their pace. To allow the robot to plan its movements, the team fitted it with sensors, including a webcam, a depth sensor and a lidar sensor. The robot was then trained in computer simulations, using reinforcement learning, to identify the optimal path through a crowd.
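The reinforcement-learning setup described above can be pictured as a reward signal scored at each simulation step. The sketch below is purely illustrative – the weights, the 0.5 m personal-space radius and the function itself are assumptions for explanation, not details from the MIT study:

```python
import math

PERSONAL_SPACE = 0.5  # assumed radius (metres) a pedestrian expects kept clear

def navigation_reward(prev_pos, robot_pos, goal_pos, pedestrians):
    """Score one simulation step: progress toward the goal is rewarded,
    intruding on a pedestrian's personal space is penalised."""
    # Reward forward progress: how much closer to the goal did we get?
    progress = math.dist(prev_pos, goal_pos) - math.dist(robot_pos, goal_pos)
    reward = 0.1 * progress
    # Penalise intrusions into any pedestrian's personal space.
    for ped in pedestrians:
        dist = math.dist(robot_pos, ped)
        if dist < PERSONAL_SPACE:
            reward -= 0.25 * (PERSONAL_SPACE - dist)
    # Small time penalty encourages keeping pace rather than freezing.
    reward -= 0.01
    return reward
```

An agent trained to maximise a signal like this learns to weave through a crowd rather than stop or barge through – the two failure modes the researchers were avoiding.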
Yu Fan “Steven” Chen, lead author of the study, commented, ‘Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians […] For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals.’
Details of the robot will be outlined in a paper presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems in September.
To find out more, visit bit.ly/2gpdt6x
In other news:
For more materials science, packaging and engineering news, visit the IOM3 website or follow us on Twitter @MaterialsWorld for regular news updates.
At the recent International Conference on Robotics and Automation, MIT researchers presented a printable origami robot that folds itself up from a flat sheet of plastic when heated and measures about a centimetre from front to back.
In a major breakthrough for cancer research, a team from Polytechnique Montréal, Université de Montréal and McGill University, Canada, has developed nanorobotic agents able to administer drugs with precision to targeted cancerous cells in tumours. Injecting medication in this way ensures optimal targeting and avoids jeopardising the surrounding organs and tissue.
The nanorobotic agents are made up of more than 100 million flagellated bacteria, which give them the ability to self-propel. Loaded with drugs, the agents move along the most direct path from the injection point to the area of the body to be treated, with enough propelling force to travel to and enter the tumours.
Once inside a tumour, the agents can detect oxygen-depleted areas, known as hypoxic zones, and deliver the drug to them. Hypoxic zones are resistant to most therapies, including radiotherapy.
The bacteria that make up the agents rely on two natural systems to move around. A synthesised chain of magnetic nanoparticles acts as a compass, allowing the bacteria to move in the direction of a magnetic field, while a sensor that measures oxygen concentration enables them to reach active tumour regions. By exposing the bacteria to a computer-controlled magnetic field, the researchers showed that the bacteria can replicate the behaviour of artificial nanorobots.
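The interplay of the two natural systems can be sketched as a simple control loop: follow the applied magnetic field until the oxygen sensor reads hypoxic, then stop and release the payload. Everything below is an illustrative toy model – the oxygen profile, step size and 0.3 threshold are invented for explanation, not taken from the researchers' work:

```python
HYPOXIA_THRESHOLD = 0.3  # assumed relative O2 level marking a hypoxic zone

def oxygen_level(position, tumour_centre):
    """Toy 1D model: oxygen falls off toward the tumour centre."""
    return min(1.0, abs(position - tumour_centre) / 10.0)

def steer_to_hypoxic_zone(start, tumour_centre, field_direction,
                          step=0.5, max_steps=100):
    """Follow the magnetic field (compass) until the oxygen sensor
    (aerotaxis) reports a hypoxic zone; that is where drugs would be
    released."""
    position = start
    for _ in range(max_steps):
        if oxygen_level(position, tumour_centre) < HYPOXIA_THRESHOLD:
            return position  # hypoxic zone reached: deliver payload here
        position += field_direction * step  # swim along the field line
    return position
```

The point of the sketch is the division of labour: the externally applied field supplies direction, while the bacteria's own oxygen sensing supplies the stop condition inside the tumour.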
The researchers say that the nanorobots open the door to the synthesis of new vehicles for therapeutic, imaging and diagnostic agents, and could reduce the harmful side effects of chemotherapy by targeting only the affected area.
MIT CSAIL’s ‘LaserFactory’ automates the full process for making functional devices in one system.
Though 3D printers have the capacity to build even life-saving components in a relatively short period of time, they lack the ability to fabricate more complex devices such as functional drones and robots that work straight out of the printer.
A group of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a new platform with an eye-catching automated process: it prints functional, custom-made robotic devices without any human intervention.
The system, called “LaserFactory”, is composed of two parts: a software toolkit for users to design custom devices, and a hardware platform that automates the fabrication process.
In a press release, CSAIL Ph.D. student Martin Nisser describes the platform as a “one-stop-shop” that could be greatly beneficial to roboticists, educators, and product developers looking to quickly build a prototype.
“Making fabrication inexpensive, fast, and accessible to a layman remains a challenge,” explains Nisser. “By leveraging widely available manufacturing platforms like 3D printers and laser cutters, LaserFactory is the first system that integrates these capabilities and automates the full pipeline for making functional devices in one system.”
Democratizing the robot-building process
The researchers plan to reveal more about their project at an event in May, though the following description from the team’s press statement gives a good idea of the platform’s workings:
“Let’s say a user has aspirations to create their own drone. They’d first design their device by placing components on it from a parts library, and then draw on circuit traces, which are the copper or aluminum lines on a printed circuit board that allow electricity to flow between electronic components.
“They’d then finalize the drone’s geometry in the 2D editor. In this case, they’d use propellers and batteries on the canvas, wire them up to make electrical connections, and draw the perimeter to define the quadcopter’s shape.”
A video further lays out the process, which sees users view a preview of their design before the proprietary software translates their custom blueprint into single-file machine instructions that allow the device to be built in one go.
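The compile step – turning a 2D design into one sequential machine program – can be sketched as follows. The instruction names (`CUT`, `DISPENSE`, `CURE`, `PLACE`) and the function are invented for illustration; LaserFactory's actual instruction format is not described in the article:

```python
def compile_design(perimeter, traces, components):
    """Flatten a 2D design into a single ordered instruction list,
    so the device can be fabricated in one go."""
    program = []
    # 1. Laser-cut the device outline first.
    for x, y in perimeter:
        program.append(f"CUT {x} {y}")
    # 2. Dispense silver paste along each circuit trace, then cure it
    #    so the traces become conductive.
    for (x1, y1), (x2, y2) in traces:
        program.append(f"DISPENSE {x1} {y1} -> {x2} {y2}")
    program.append("CURE")
    # 3. Pick and place components last, onto the cured traces.
    for name, (x, y) in components:
        program.append(f"PLACE {name} {x} {y}")
    return program
```

The single-file ordering is what removes the human from the loop: cutting, trace dispensing, curing and assembly all run as one uninterrupted program.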
In terms of hardware, the laser cutter features an add-on that prints circuit traces and assembles components — it even allows the machine to dispense and cure silver to make circuit traces conductive.
Impressively, the method allows for a drone to fly right out of the sheet of material that was cut to make its shell.
Credit: Video by Alexandre de Bellefeuille - Photos by Malina Corpadean
By Shardell Joseph
Robotic clothing made of silicone, glass and polyvinylidene fluoride has been fitted with electronic devices that make each garment react to the colour spectrum of its surroundings.
Montreal-based fashion designer Ying Gao titled the collection ‘flowing water, standing time’, with the aim of capturing ‘the essence of movement and stability over a period of time’. Gao also wanted to focus on the flow of ‘energies’ through a garment, mirroring the colours of its immediate surroundings.
The project was inspired by neurologist Oliver Sacks’ book, The Man Who Mistook His Wife for a Hat, in which he relates the story of Jimmie G, a 49-year-old former sailor convinced that he is still 19, his age when he left the Navy.
Shocked by his own reflection when Sacks hands him a mirror, Jimmie reverts to his 19-year-old self as soon as his gaze leaves the reflective surface. Having lost any sense of temporal continuity, Jimmie lives as a prisoner to this single, perpetual moment, oscillating between a presence to the world and a presence to self.
Reflecting the character’s journey through the book, the garments perpetually shift between two states as they react to the chromatic spectrum. ‘In order to echo this varying mobility, the garments are capable of chromatic movement,’ said Gao. ‘Capable of recognising the colours in their immediate surroundings, they are at once liquid and chameleon-like, adapting to the slow rhythm of their ever-changing environment.
‘A mirror effect is at play – the garments are reacting to what they see. Much like Oliver Sacks’ patient, they alternate between what they are, and what they can potentially become – all the while embodying the inherent complexity of all things.’