SOFT DRONE
INTERNSHIP PROJECT AT THE MIT SPARK LAB
JUNE - AUGUST 2024
Full project title:
Design, Construction, and Flight Testing of an Improved Soft Drone with a Pivoting Camera
I conducted this project as my summer internship in the Sensing, Perception, Autonomy, and Robot Kinetics (SPARK) lab at MIT. My task was to design and test a soft drone, improving on the lab's previous soft drone design.
My final design was 43% larger, reduced the ground effect on the drone while grasping by 60%, and added a computer-controlled camera pivot that nearly doubled the effective FOV of the depth tracking camera and can be used to reduce motion blur while grasping (an attribute essential for SLAM and object tracking).
Soft drones are a new class of quadcopter that use compliant ("squishy") grippers to grasp more general objects, and the lab uses them to conduct research in autonomous robotics and robot perception.
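The "nearly doubled effective FOV" figure above comes from letting the pivot sweep extend the camera's fixed field of view. A minimal sketch of that arithmetic (the specific angles below are illustrative assumptions, not the actual RealSense or servo specs):

```python
def effective_fov(camera_fov_deg: float, pivot_range_deg: float) -> float:
    """Effective vertical field of view of a pivoting camera.

    Sweeping the camera through pivot_range_deg extends its static FOV
    by that amount, capped at a full 180-degree hemisphere.
    """
    return min(camera_fov_deg + pivot_range_deg, 180.0)

# Illustrative numbers: a ~58-degree vertical FOV plus a ~55-degree
# servo sweep roughly doubles the observable vertical angle.
print(effective_fov(58.0, 55.0))
```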
Design Techniques: Computer Aided Design (CAD), PID Tuning
Manufacturing: 3D Printing, Soldering, Hand-Assembly
Software Applications: Fusion 360, QGroundControl, ArduPilot
Programming Languages & Tools: Python, Robot Operating System 2 (ROS2), Linux, Git/GitHub, Visual Studio Code
Hardware: Intel RealSense Cameras, NVIDIA Jetson Xavier, Pixhawk, Soft Robotic Grippers
This PowerPoint was used to present my project at the Research Science Institute (RSI), MIT's premier summer research program. It provides a high-level summary of the entire project, explains its significance, and contextualizes the various engineering components used throughout.
Pictures and descriptions from throughout the project.
2D CAD Drawings of the drone.
Figure from the full paper describing the components of the drone.
Me holding up the soft drone.
A closer view of the depth camera and gripper arrangement.
Electrical system diagram.
Early design sketches, with an emphasis on the depth camera.
Camera mount versioning in CAD. Version 1 was the minimum viable product. Version 2 added a servo to gimbal the camera, providing the ability to counteract motion blur and increase field of view. Version 3 simplified the design, using fewer, smaller components to accomplish the same task as Version 2.
Real photo of pivoting depth camera mount (two photos superimposed to show motion).
CAD model of pivoting depth camera mount (two images superimposed to show motion).
Testing the depth camera on myself.
Testing the depth camera's interaction with the gripper. On the previous drone, the gripper intruded into the depth camera's FOV, rendering it unable to track objects while the gripper was open.
ROS2 node diagram.
Reference frames of the drone and the camera.
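A pivoting camera adds one extra rotation between the camera frame and the drone body frame. A hedged sketch of that transform (the axis convention and the helper name `camera_to_body` are my assumptions for illustration, not the lab's actual code):

```python
import math

def camera_to_body(point_cam, pivot_rad):
    """Rotate a camera-frame point into the drone body frame,
    assuming the camera pivots about the body y-axis (pitch)."""
    x, y, z = point_cam
    c, s = math.cos(pivot_rad), math.sin(pivot_rad)
    # Standard rotation about the y-axis by the pivot angle.
    return (c * x + s * z, y, -s * x + c * z)

# With the camera at zero pivot, camera and body frames coincide.
print(camera_to_body((1.0, 2.0, 3.0), 0.0))
```

In practice this transform would be chained with the body-to-world pose from the flight controller before feeding points to tracking or SLAM.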
Redesigned gripper mount that reduces ground effect by placing the gripper farther from the plane of the propellers.
Using PrusaSlicer to generate G-code for the lab's 3D printer.
The setup for transporting all the supplies from the lab to the flight-testing room.
Me working on the soft drone while it rests on a stand I modified from PVC pipes.
Using a monitor to boot up the NVIDIA Jetson and activate its depth camera before flight.
Testing the depth camera on the drone.
The soft drone in flight.
The drone's view as it approaches an object to grasp (a Pepsi bottle).
Damaged gripper after a crash caused by an improperly tuned PID controller (the controller had not been re-tuned after the depth sensor setup added weight to the drone). I repaired the gripper and retuned the PID controller.
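The crash caption above hints at why PID gains are mass-dependent: a controller tuned for one airframe mass produces the wrong thrust once a payload is added. A toy 1-D altitude-hold simulation illustrating the effect (all gains and masses are made up for illustration; this is not the actual tuning procedure):

```python
def simulate_hover(actual_mass, modeled_mass, kp=20.0, kd=10.0,
                   target=1.0, dt=0.01, steps=2000):
    """Toy 1-D altitude hold: a PD controller plus a hover feedforward
    computed from the *modeled* mass, flown on the *actual* mass."""
    g = 9.81
    z, vz = 0.0, 0.0
    for _ in range(steps):
        err = target - z
        thrust = modeled_mass * g + kp * err - kd * vz
        vz += (thrust / actual_mass - g) * dt  # semi-implicit Euler
        z += vz * dt
    return z

# Controller matched to the airframe: settles at the 1 m target.
print(round(simulate_hover(1.0, 1.0), 2))
# Unmodeled payload (e.g. a new sensor rig): the same gains hover low.
print(round(simulate_hover(1.3, 1.0), 2))
```

With only a PD loop, the unmodeled weight leaves a steady-state altitude error of roughly Δm·g/kp; retuning (or an integral term) is what removes it.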
Tuning the drone's actuators in QGroundControl.