= Room mapping robot

== Contributors
- Γαβριήλ Σίνγκ - cs131083
- Ιωνάθαν Μπαξεβανίδης - cs161083
- Κωνσταντίνος Χουτιούδης - cs141296

== Project Description
This was an attempt at making a "budget" SLAM (Simultaneous Localization and Mapping) robot-vehicle with a time-of-flight laser sensor. It uses an ESP32 as its brain (tons of I/O pins, dual core, Wi-Fi & Bluetooth on chip, low consumption, etc.) and an Arduino Nano that acts as a slave, controlling the drive tracks by taking commands from the ESP32.

== What is SLAM
In navigation, robotic mapping and odometry for virtual reality or augmented reality, simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it.

== Working principle
The working principle behind the robot is based on dead reckoning: we build the map as we go, assuming we started from position (0,0). Using the ToF sensor we take measurements in all directions (360 degrees); this is done with the stepper motor together with the Hall effect sensor, so we know the angle of each measurement and can bind each (distance, angle) pair to the robot's position (a sketch of this conversion follows the "Why a laser sensor?" section below).

The problems with our particular platform were space limitations and the fact that the tracks are driven by DC motors. Possible solutions would be driving the tracks directly with stepper motors, so we know exactly how many steps each track has made and can deduce the robot's position, or fitting photo encoders to the small DC motors to get the same effect. (Problems with slippage are also very real and can throw off our measurements, but hey, we're not supposed to send this robot to Mars.)

Being able to determine the traveled distance and having 360 "accurate" measurements of the traversed space would let us build an algorithm which, using these data, could find a way to traverse the space, creating a map along the way.

== Demo video & Photos
video::Tu37dVoCxmo[youtube,500,500]

image:./pics/demo1.jpg[700,700]
image:./pics/demo2.jpg[700,700]
image:./pics/inabox.jpg[700,700]
image:./pics/drawbox.png[700,700]

== Parts
- Esp32
- Arduino Nano
- 28BYJ-48 Stepper
- ULN2003 Stepper Motor Driver
- I2C Logic Level Converter
- VL53L1X
- Hall Sensor
- Slip ring
- 2x 18650 Batteries
- Battery Protection Board
- Model RC truck
- L298N Motor Driver

image:./pics/esp32.jpg[200,200]
image:./pics/nano.jpg[200,200]
image:./pics/stepper&driver.jpg[200,200]
image:./pics/logic_level.jpg[200,200]
image:./pics/sensor.jpg[200,200]
image:./pics/hall.jpg[200,200]
image:./pics/slipdisc.png[200,200]
image:./pics/bat.jpeg[200,200]
image:./pics/bms.jpg[200,200]
image:./pics/truck.png[200,200]
image:./pics/L298N.jpg[200,200]

image:./pics/diagram.png[1000,1000]

== Why a laser sensor?
The main reason we went with a laser sensor is that light-based sensors take less time to acquire measurements, and we expected it to be more accurate as well; however, the VL53L1X turned out to be less accurate than expected.
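As a rough illustration of the dead-reckoning idea above, here is a minimal sketch of how one (angle, distance) reading could be turned into a map point relative to the starting position. It assumes the Pololu VL53L1X Arduino library for the sensor; the pose variables (`robotX`, `robotY`, `robotHeading`) and the step angle are placeholders that would have to come from the (not implemented) track odometry and the real stepper position.

[source,cpp]
----
#include <Wire.h>
#include <VL53L1X.h>  // Pololu VL53L1X library (assumed)

VL53L1X sensor;

// Dead-reckoned robot pose. On this platform these would have to come from
// track odometry (stepper-driven tracks or encoders), which the DC motors
// don't provide, so they stay at the assumed start position (0,0).
float robotX = 0.0, robotY = 0.0;  // mm
float robotHeading = 0.0;          // rad, 0 = facing along +X

void setup() {
  Serial.begin(115200);
  Wire.begin();

  sensor.setTimeout(500);
  if (!sensor.init()) {
    Serial.println("VL53L1X not found!");
    while (true) {}
  }
  sensor.setDistanceMode(VL53L1X::Long);
  sensor.setMeasurementTimingBudget(50000);  // 50 ms per measurement
  sensor.startContinuous(50);
}

// Bind one (angle, distance) measurement to the robot's position,
// producing a point in map coordinates relative to the (0,0) start.
void recordPoint(float scanAngleDeg, uint16_t distanceMm) {
  float a = robotHeading + scanAngleDeg * PI / 180.0;
  float mapX = robotX + distanceMm * cos(a);
  float mapY = robotY + distanceMm * sin(a);
  Serial.print(mapX);
  Serial.print(",");
  Serial.println(mapY);
}

void loop() {
  // scanAngleDeg would be derived from the stepper position
  // (see the homing sketch at the end of this document).
  static float scanAngleDeg = 0.0;
  uint16_t d = sensor.read();  // blocking read, distance in mm
  if (!sensor.timeoutOccurred()) {
    recordPoint(scanAngleDeg, d);
  }
  scanAngleDeg += 1.8;  // placeholder step angle
  if (scanAngleDeg >= 360.0) scanAngleDeg -= 360.0;
}
----

A real mapping loop would update `robotX`, `robotY`, and `robotHeading` between scans; without odometry, all points are only relative to wherever the robot currently happens to be.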
== Difficulties & things to consider
- Acquiring the robot's position in a room
- Getting accurate data from budget sensors with low resolution
- Calculating travel distance based on DC motors
- Maintaining electrical connections between the two rotating parts

== Problems we managed to solve

=== Getting the angle of a measurement
Solution: Using a Hall sensor and a magnet to find the 0-degree spot, and using a stepper motor with a known gear ratio so we know the step angle (see the sketch at the end of this document).

=== Power consumption
Solution: Used 2x 18650 cells because simple alkaline batteries weren't up to the task.

=== Helpful links
link:https://www.youtube.com/watch?v=fQ2iB7qkrUg[youtube homemade lidar]
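Below is a minimal sketch of the angle-of-measurement solution above, using the standard Arduino Stepper library with the 28BYJ-48/ULN2003 and a digital Hall sensor. The pin numbers, the active-low Hall output, and the ~2048 steps-per-revolution figure are assumptions for this example and should be checked against the actual wiring and gearbox.

[source,cpp]
----
#include <Stepper.h>

// 28BYJ-48 geared stepper: roughly 2048 full steps per output revolution
// (the real gear ratio is ~63.68:1, so 2048 is an approximation).
const int STEPS_PER_REV = 2048;

// Assumed wiring: ULN2003 IN1..IN4 on pins 8..11 (note the 1-3-2-4 pin
// order the Stepper library expects for this motor), Hall sensor on pin 2.
Stepper turret(STEPS_PER_REV, 8, 10, 9, 11);
const int HALL_PIN = 2;

long currentStep = 0;  // steps away from the 0-degree (magnet) position

void setup() {
  Serial.begin(9600);
  pinMode(HALL_PIN, INPUT_PULLUP);  // typical Hall modules pull low near the magnet
  turret.setSpeed(10);              // RPM

  // Homing: rotate one step at a time until the Hall sensor sees the magnet.
  while (digitalRead(HALL_PIN) == HIGH) {
    turret.step(1);
  }
  currentStep = 0;  // this is our 0-degree reference
}

float currentAngleDeg() {
  return (currentStep % STEPS_PER_REV) * 360.0 / STEPS_PER_REV;
}

void loop() {
  turret.step(1);  // advance the sensor turret by one step
  currentStep++;
  Serial.println(currentAngleDeg());  // angle to attach to the next ToF reading
}
----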