Hello! I'm Ryan Weideman.


About

I'm a software engineer, most recently at Amazon Lab126 working on the Amazon Astro Home Robot. I worked on the embedded autonomy platform, which performs critical functions of the robot such as motion and navigation, and I designed, implemented, and maintained core features and components that run at scale across the fleet of customer devices.

I hold M.S. and B.S. degrees in Electrical and Computer Engineering from Cal Poly in San Luis Obispo. While working toward my degrees I did software and firmware engineering internships at Amazon, served as an orientation leader for incoming students, and was a member of the robotics club.

Feel free to contact me on LinkedIn.


Portfolio


Amazon Astro Home Robot

I worked on Astro from early in the program's development through the product's public launch and first customer shipments.

My team was responsible for developing the embedded software that controls the motion of the robot's head and body, as well as the robot's behavior. Essentially, anytime the robot moves, I can point and say "That's my code!"

We heavily leveraged the Robot Operating System (ROS), an open-source, event-driven distributed computing framework and collection of tooling designed for robotics applications. (Despite the name, ROS is more accurately middleware than an operating system.)

I designed and implemented ROS nodes, triaged and fixed urgent production bugs, supported customer teams, designed test plans with QA, mentored team members, and more.
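
For a sense of what that work looks like, here is a minimal sketch of a ROS publisher node in C++. This is not Astro's code, just the general shape of a node that publishes velocity commands on the conventional cmd_vel topic; the node name is made up:

    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "motion_commander");  // hypothetical node name
      ros::NodeHandle nh;
      ros::Publisher pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 10);

      ros::Rate rate(10);  // 10 Hz publish loop
      while (ros::ok()) {
        geometry_msgs::Twist cmd;
        cmd.linear.x = 0.1;  // modest forward velocity in m/s
        pub.publish(cmd);
        ros::spinOnce();
        rate.sleep();
      }
      return 0;
    }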

Main Technologies: C/C++, Python, Robot Operating System (ROS), MATLAB, Docker, Git


Robot Navigation in Cluttered Environments with Deep Reinforcement Learning

The aim of this project, my master's thesis research, was to apply deep reinforcement learning to enable a mobile robot to navigate a cluttered room using cheap and accessible camera sensors. The video to the right demos what the robot sees as it successfully avoids obstacles.

I experimented with extending this model so that the robot could learn to navigate toward a target position while avoiding obstacles. I leveraged AWS to accelerate simulation and model training, and analyzed the data to show that my extended model successfully navigated a variety of cluttered environments. The work is published here.
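
The core of an extension like this is the reward signal: the agent is rewarded for progress toward the target while being penalized for approaching obstacles. A hedged sketch of that kind of shaping, with illustrative constants rather than the tuned values from the thesis:

    // Illustrative reward shaping for target-seeking obstacle avoidance.
    // All constants are hypothetical, not the thesis's tuned values.
    double reward(double dist_to_goal, double prev_dist_to_goal,
                  double min_obstacle_dist, bool collided) {
      if (collided) return -100.0;  // a collision ends the episode with a large penalty
      // Reward progress toward the goal...
      double progress = prev_dist_to_goal - dist_to_goal;
      // ...and penalize creeping too close to obstacles.
      double proximity_penalty =
          (min_obstacle_dist < 0.5) ? (0.5 - min_obstacle_dist) : 0.0;
      return 10.0 * progress - proximity_penalty;
    }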

Main Technologies: Python, AWS EC2, MATLAB, Deep Learning, Robot Operating System (ROS)


Digital Video Streaming for Nano Quadcopters

The objective of this project was to create a digital video streaming platform for consumer quadcopters. My team (two other engineers) and I designed a custom PCB with a WiFi-enabled MCU and a mobile phone camera for the CrazyFlie nano-quadcopter platform.

The images to the left show the nano-quadcopter outfitted with our platform, along with a snapshot of us from the quadcopter's real-time video feed.

In our MCU selection and design process we faced tradeoffs between weight, flight time, video framerate, and image resolution. We identified the Texas Instruments CC3220 MCU, designed for IoT applications, as the ideal MCU for our streaming platform: it has both a built-in WiFi networking subsystem and an 8-bit parallel peripheral for interfacing with low-cost camera sensors.

My primary contributions were:

  • An I2C firmware driver to configure the camera sensor's framerate, resolution, JPEG compression, etc.
  • A WiFi firmware driver for streaming compressed images via a custom UDP-based networking protocol (sketched below)
  • A Java application that listens for the UDP packets, decompresses the video stream, and displays it to the user
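
Since UDP makes no delivery guarantees, the protocol had to tolerate loss. Here is a hypothetical packet layout in the spirit of what we built; the field names and sizes are illustrative, not our actual wire format:

    #include <cstdint>

    // Each JPEG frame is split into chunks small enough for one datagram.
    struct StreamPacketHeader {
      uint16_t frame_id;     // increments once per JPEG frame
      uint16_t chunk_index;  // position of this chunk within the frame
      uint16_t chunk_count;  // total chunks in the frame
      uint16_t payload_len;  // bytes of JPEG data following this header
    };
    // The receiver groups chunks by frame_id and drops incomplete frames:
    // a lost datagram costs one video frame, not the whole stream.
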
Components                       Peripheral Interface
TI CC3220 MCU                    ---
MT9D111 Camera                   I2C & 8-bit parallel interface
WiFi Networking Subsystem        Internal to MCU
UART to USB Converter            UART

See our published final project report at this link for more details.


LIDAR Mapping Robot


The goal of this personal project was to build a robot that could map out the floorplan of a room. It featured a custom-designed and 3D-printed LIDAR system, prototype hardware based on the STM32F4 MCU, and a Python application to visualize the LIDAR data.

In the video to the right you can see the floorplan of my apartment emerge as the LIDAR sweeps around the room. The color of each point indicates its distance from the sensor.

The image included is of my prototype robot "motherboard", which hosts an STM32F4 development board and connectors for the LIDAR (I2C), IMU, stepper motor driver, UART-to-USB converter, and more.

Firmware was written to do the following (a sketch of the scan loop follows this list):

  • Configure the LIDAR sensor via I2C
  • Step the stepper motor via GPIO signals to the stepper driver electronics, while reading in distance data from the LIDAR
  • Write out LIDAR data via UART to a host PC for visualization
  • Control the 4 main DC drive motors via a PWM motor driver
  • Handle interrupts from the main drive motor encoders for velocity control
  • Configure the IMU and retrieve robot yaw data via I2C
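
A rough sketch of the scan loop, with hypothetical helper names standing in for the real drivers:

    #include <stdint.h>

    void stepper_step_once(void);        // pulse the STEP pin via GPIO
    uint16_t lidar_read_distance(void);  // I2C read from the LIDAR
    void uart_printf(const char* fmt, ...);

    // One full revolution = STEPS_PER_REV microsteps, one range sample per step.
    const int STEPS_PER_REV = 400;  // illustrative; depends on microstepping config

    void scan_once(void) {
      for (int step = 0; step < STEPS_PER_REV; ++step) {
        stepper_step_once();
        uint16_t dist_cm = lidar_read_distance();
        float angle_deg = step * (360.0f / STEPS_PER_REV);
        // Stream "angle,distance" over UART for the host-side Python visualizer
        uart_printf("%.1f,%u\r\n", angle_deg, dist_cm);
      }
    }
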
Components                                  Peripheral Interface
STM32F4 MCU                                 ---
BNO055 IMU Sensor                           I2C
LIDAR-Lite Sensor                           I2C
A4988 Stepper Motor Driver (1 stepper)      GPIO
L298N Brushed DC Motor Drivers (4 motors)   PWM
4 Quadrature DC Motor Encoders              GPIO interrupts
UART to USB Converter                       UART

Self-Balancing Robot

The goal of this personal project was to build a self-balancing two-wheeled robot. I did the following (the PID loop is sketched after this list):

  • Designed the full robot in CAD and fabricated it with a 3D printer
  • Wrote a firmware driver to configure the IMU and retrieve 3-DOF orientation data via I2C
  • Wrote firmware to drive the 2 main DC motors via a PWM motor driver
  • Implemented a PID controller that takes the robot's pitch as input and outputs the motor voltages needed to stay upright
  • Designed the power electronics: a 12V battery and a buck converter powering the electronics
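
A minimal sketch of the balance loop's PID update, with hypothetical gains and a fixed-rate loop rather than the exact firmware:

    // Input: pitch error in degrees; output: signed motor command.
    struct Pid {
      float kp, ki, kd;
      float integral = 0.0f;
      float prev_error = 0.0f;

      float update(float error, float dt) {
        integral += error * dt;                        // correct steady-state lean
        float derivative = (error - prev_error) / dt;  // damp fast tipping
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
      }
    };

    // Each control tick: read pitch from the IMU, then command both motors.
    //   Pid pid{12.0f, 0.5f, 0.8f};                       // hypothetical gains
    //   float cmd = pid.update(0.0f - pitch_deg, 0.01f);  // 100 Hz loop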

You can see the robot balancing itself in the video to the left. It is able to stay upright, albeit a bit wobbly. Future improvements to stability would include making the robot taller and incorporating motor encoder data into the control system.

Components                       Peripheral Interface
Arduino Mega MCU                 ---
MPU6050 IMU                      I2C
L298N Brushed DC Motor Drivers   PWM

Roborodentia Robotics Tournament (2018)


This robot was my entry in Cal Poly's 2018 Roborodentia Robotics Tournament. The goal was to build a robot that could collect Nerf balls from dispensers and shoot them into a net on the opponent's side of the table.

I worked with a mechanical engineer who designed and 3D printed most of the robot, while I was responsible for electronics and firmware.

We placed 2nd in the tournament. The demo video to the right is of one of our test runs during development.

I did the following (the autonomy loop is sketched after this list):

  • Selected the Arduino Mega as the MCU
  • Firmware for interfacing with the 2 IR ToF sensors
  • A position estimator based on the ToF sensors' depth data
  • Firmware for interfacing with the 4 bump sensors
  • Firmware for the robot's main drive train: PWM for the DC drive motor and PWM for the caster servo
  • Firmware for the robot's shooting mechanism: brushless DC launch motor control and ball intake motor control
  • Power electronics: a 12V battery and a buck converter powering the electronics
  • The robot's autonomy (i.e., align with the dispenser, retrieve balls, drive to the shooting position, launch balls into the net, repeat)
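
The autonomy reads naturally as a small state machine. A sketch with hypothetical sensor predicates (the real firmware was Arduino C++):

    enum class State { ALIGN, RETRIEVE, DRIVE_TO_SHOOT, LAUNCH };
    State state = State::ALIGN;

    // Hypothetical predicates implemented elsewhere in the firmware.
    bool aligned_with_dispenser();
    bool balls_loaded();
    bool at_shooting_position();
    bool magazine_empty();

    void loop() {  // Arduino-style main loop
      switch (state) {
        case State::ALIGN:           // line up with the ball dispenser
          if (aligned_with_dispenser()) state = State::RETRIEVE;
          break;
        case State::RETRIEVE:        // run the intake until loaded
          if (balls_loaded()) state = State::DRIVE_TO_SHOOT;
          break;
        case State::DRIVE_TO_SHOOT:  // drive to the shooting position
          if (at_shooting_position()) state = State::LAUNCH;
          break;
        case State::LAUNCH:          // spin up the launcher and fire
          if (magazine_empty()) state = State::ALIGN;  // repeat the cycle
          break;
      }
    }
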
Components                         Peripheral Interface
Arduino Mega MCU                   ---
2 ToF LIDAR Sensors                I2C
4 Bump Sensors                     GPIO
Servo Motor                        PWM
ESC Brushless Motor Driver         PWM
2 L298N Brushed DC Motor Drivers   PWM
Buck Converter                     ---

Roborodentia Robotics Tournament (2017)

This robot was my entry in Cal Poly's 2017 Roborodentia Robotics Tournament. The goal was to build a robot that could retrieve rings from horizontal pegs and place them on vertical pegs to score. There was also a flag that a robot could flip to gain a 1.5x score multiplier.

I worked with a mechanical engineer who designed and 3D printed most of the robot, while I was responsible for electronics and firmware.

We placed 1st in the tournament. The video to the right is of our final winning match.

I did the following (the line-position estimate is sketched after this list):

  • Selected the Arduino Mega as the MCU
  • Firmware for retrieving measurements from the IR LED array
  • Firmware for interfacing with the 4 bump sensors
  • Firmware for the robot's motors: 2 DC drive motors, 1 servo for the arm mechanism, and 1 servo for the flag flipper
  • A PID controller that takes the robot's position over the black line as input and outputs motor voltages
  • Power electronics: a 12V battery and a buck converter powering the electronics
  • The robot's autonomy (grab rings, flip the flag, follow the line to the scoring pegs, move the arm to dump the rings and score, repeat)
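
The line follower's input came from the IR array. A hedged sketch of the standard weighted-average position estimate (the scaling is illustrative, not necessarily what I shipped):

    #include <stdint.h>

    // Estimate the line's offset from the array center: each sensor's
    // reflectance reading is weighted by its index, giving a centroid.
    float line_position(const uint16_t readings[], int n) {
      long weighted = 0, total = 0;
      for (int i = 0; i < n; ++i) {
        weighted += (long)readings[i] * i * 1000L;
        total += readings[i];
      }
      if (total == 0) return 0.0f;                // line lost; hold course
      float center = (n - 1) * 1000.0f / 2.0f;
      return (weighted / (float)total) - center;  // <0 = left of center, >0 = right
    }

The PID controller then maps this offset to differential motor voltages.
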
Components                       Peripheral Interface
Arduino Mega MCU                 ---
IR LED Proximity Array           I2C
2 Servo Motors                   PWM
L298N Brushed DC Motor Drivers   PWM
4 Bump Sensors                   GPIO
Buck Converter                   ---

FIRST Robotics

I co-founded a FIRST Robotics team, and from 2012 to 2014 we built robots and competed in tournaments throughout California. We did technology outreach in our local community, documented our development process and experience in this blog, and even made the news a couple of times.


About This Site

This static page was built from scratch with HTML and CSS (no templates or static site generators). My domain name was acquired through Google Domains, and the site is hosted on a DigitalOcean Droplet via Apache Web Server.