Alexander Weber has a nice build log on his drawbot, Mechpen, which is available on GitHub:
This is Mechpen, my newest drawbot.
The idea was to have a robot arm that could sketch on a rather large surface.
It is a SCARA (Selective Compliance Assembly Robot Arm), meaning it has a shoulder joint, an elbow joint, and a hand. Mechpen has a reach of 140 cm, which means it can sketch up to A0 format.
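The pen position of a two-link SCARA arm follows from simple planar trigonometry. Here's a minimal Python sketch; the 0.7 m link lengths are an assumption chosen so the numbers add up to the stated 140 cm reach, not Mechpen's actual dimensions:

```python
import math

def scara_forward(shoulder, elbow, l1=0.7, l2=0.7):
    """Pen (x, y) position for a two-link SCARA arm.

    Angles are in radians; link lengths in metres. With l1 = l2 = 0.7
    (an assumption, not Mechpen's real geometry), the fully extended
    arm reaches 1.4 m.
    """
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

# Both joints at zero: arm fully extended along the x axis.
x, y = scara_forward(0.0, 0.0)
```

Inverting these two equations (solving for the joint angles given a target pen position) is what the firmware has to do for every point of a drawing.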
We are excited to share our latest and most ambitious robot, the Curiosity Mars Rover. This is a highly interactive, 1/10th-scale functional replica of the NASA Curiosity Mars Rover. This project was ambitious for us in two main ways. First, we worked very hard to make the robot visually accurate to the original NASA rover. This necessitated custom designing and manufacturing nearly every visible component on the robot; one of the key challenges was getting the required level of detail and functionality into such a small-scale robot. Second, we encapsulated all the features and capabilities we wanted for this robot into a robust, maintainable, and modular electronics package based on a stack of custom printed circuit boards (PCBs) that we designed. This post focuses on the external view of the robot, while future posts will focus on the electronics and functionality.
We present a computational design system that allows novices and experts alike to easily create custom robotic devices. The core of our work consists of a design abstraction that models the way in which electromechanical components can be combined to form complex robotic systems. We use this abstraction to develop a visual design environment that enables an intuitive exploration of the space of robots that can be created using a given set of actuators, mounting brackets and 3d-printable components. Our computational system also provides support for design auto-completion operations, which further simplifies the task of creating robotic devices. Once robot designs are finished, they can be tested in physically simulated environments and iteratively improved until they meet the individual needs of their users.
We believe in helping people understand how the world works. With so much development going on in robotics at the moment, now is the perfect time to get to know what it takes to build your own robot. The MeArm Pi is an award-winning robotic arm kit that’s simple enough for a child to assemble. It integrates directly with the Raspberry Pi you know and love, and you can control it either directly using the on-board joysticks or by programming it from the Pi in your favourite programming language.
The IMAV (International Micro Air Vehicle) conference and competition is a yearly flying robotics competition hosted by a different university each year. AKAMAV – a university student group at TU Braunschweig in Germany – have written up a fascinating and detailed account of what it was like to compete in (and win) 2016’s eleven-mission event hosted by the Beijing Institute of Technology.
AKAMAV’s debrief of IMAV 2016 is well-written and insightful. It covers not only the five outdoor and six indoor missions, but also details what it was like to prepare for and compete in such an intensive event. In their words, “If you share even a remote interest in flying robots and don’t mind the occasional spectacular crash, this place was Disney Land on steroids.”
The eleven missions were inspired by a hypothetical oil spill on an oil platform. They included:
Takeoff from a moving and rocking platform
Live mapping of a mission area to identify targets of interest
Precise delivery of a life buoy / life-preserver
Collection and delivery of a water sample
Landing on the same moving and rocking platform
Takeoff from a moving platform
Enter a target building from doorway, chimney, or window
Map the inside of the building
Pickup and release of specific objects (i.e. find specific objects and move them to designated locations)
Exit the building
Land on a moving platform
Multiple drones could be used, but teams scored better the smaller the drones were, the more missions were completed in a single flight, and the higher the difficulty of the chosen challenges (e.g. landing on a moving rather than a stationary platform). Human piloting was permitted for the indoor missions, but the outdoor missions had to be flown autonomously. Overall, the fewer humans were involved, the better the team’s score.
For the outdoor missions, AKAMAV used two identical 550 mm quads with 600 g payloads. The flight controller was a Pixhawk running APM:Copter, with an onboard companion PC for other tasks, plus other components as needed. For the indoor missions, 60 mm Tiny Whoop quads handled the takeoff, entering, and landing portions, while the indoor mapping and pickup-and-release missions were handled by a modified 250 mm racer using a 360-degree camera combined with Pix4D software.
The entire after-action report is worth a read, but their approach to live mapping of an outdoor area is particularly interesting. AKAMAV stated that there is no plug-and-play solution for live mapping an outdoor area, so they focused on that mission, confident it could give them an edge over the competition. They planned a hybrid approach, balancing on-board processing with off-board processing to use the strengths of each. In this they were not completely successful, but they were still happy with their results. “We succeeded somehow with a not so beautiful, yet real-time stitched map. […] In the end 5 out of 6 targets can be recognized, making us one of few teams solving the live mapping mission at IMAV 2016.” Check out their published conference paper: Towards a Real-Time Image Mosaicing Solution.
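Their paper describes the real pipeline; purely as a toy illustration of the incremental idea, a real-time-friendly mosaicer can paste each incoming frame into a growing map at an offset derived from the aircraft's pose, with later frames overwriting earlier ones in the overlap. Nothing below is from AKAMAV's code; 2-D lists stand in for images:

```python
def paste_frame(mosaic, frame, top, left):
    """Paste a small frame (2-D list of pixels) into the mosaic at
    (top, left). Overlapping pixels are simply overwritten -- the
    'last write wins' rule a simple incremental mosaicer can use."""
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            mosaic[top + r][left + c] = px
    return mosaic

# A 6x6 blank map, then two overlapping 3x3 frames at pose-derived
# offsets (0, 0) and (2, 2).
mosaic = [[0] * 6 for _ in range(6)]
paste_frame(mosaic, [[1] * 3 for _ in range(3)], 0, 0)
paste_frame(mosaic, [[2] * 3 for _ in range(3)], 2, 2)
```

A production system would instead warp frames with a homography estimated from image features or GPS/IMU data, which is where the real difficulty (and the on-board vs. off-board processing trade-off) lives.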
Embedded below is an early video of AKAMAV’s maiden test flight carrying their chosen 360-degree camera (a Ricoh Theta S) which they would later use to generate the 3D point cloud for indoor mapping at IMAV 2016. If you’re not familiar with these videos, pan around by clicking and dragging.
To really bring R2-D2 to life, I knew I had to do some great stuff with electronics. I started with a Taranis X9D controller and a pair of X8R receivers configured for a full 16 channels. I chose this platform mainly for its versatility, but also because I wanted to have separate control of the head and body with one remote controller (so the head can spin a full 360 degrees without tangling the wires).
Tote is a small (fits inside your palm), four-legged, walking robot, with three degrees of freedom per leg, Arduino for its brains and controlled either with a TV remote, or by additional electronics added on top of it. It is very simple, cheap and sturdy, for this class of robots. Its goal is to be a starting point for anyone who wants to start building multi-legged robots.
Here are the python scripts that send serial commands to the motor controllers.
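The scripts themselves are in the repo; as a generic illustration of the idea, a motor command is typically packed into a small framed byte string before being written out over the serial port. The framing below (an ID byte, a signed 16-bit speed, and an additive checksum) is hypothetical, not the actual protocol these controllers speak:

```python
import struct

def speed_command(motor_id, speed):
    """Pack a hypothetical motor-speed command frame: one motor-ID
    byte, a big-endian signed 16-bit speed, and a one-byte additive
    checksum over the payload."""
    if not -32768 <= speed <= 32767:
        raise ValueError("speed out of range for int16")
    payload = struct.pack(">Bh", motor_id, speed)
    checksum = sum(payload) & 0xFF
    return payload + bytes([checksum])

# With pyserial, the frame would then be written to the port, e.g.:
#   ser = serial.Serial("/dev/ttyUSB0", 115200)
#   ser.write(speed_command(1, 1200))
cmd = speed_command(1, 1200)
```

The checksum lets the controller firmware reject frames corrupted on the wire, which matters more than usual when the other end is switching FETs.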
Here are the EAGLE files, gerbers, and BOM for the motor controllers and sensor boards. When I sent the boards to 3PCB, the text on the motor controllers got all scrambled, so keep that in mind. At this point I’ve built up three of each board, and they all work. I haven’t even blown up a single FET yet in all of my motor control derping so far.
Here are my CAD files for the motor, gearbox, motor module, and leg. Requires Solidworks 2015-2016 to open. Many of the gearbox files have HSMWorks CAM in them, so you’ll need the full version of HSMWorks to view the CAM. There’s also a list of the gears I got from KHK and the post-machining I did on them.