
Learnings from FIRST Robotics competition

I recently participated in FRC by joining RFactor, a team based in Mumbai. Although I lived in Goa, I would travel to Mumbai during my holidays, especially Diwali, Christmas and the summer break. Despite this restriction I persevered, spending 16-18 hours every day at the lab. With this commitment I was able to contribute significantly to the team in programming, CAD and prototyping.

Viraj and I together managed the entire programming side and coded the swerve drive of the robot. We used PathPlanner and YAGSL for this and were able to build a highly accurate swerve. Using PathPlanner meant that our robot's autonomous code was very accurate and could correct itself at any moment using precise odometry.
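Below is a minimal sketch of what a YAGSL-based swerve subsystem setup can look like. It is not our exact code: the class name, deploy-directory name and maximum speed are placeholders, and YAGSL reads the actual module, motor and encoder settings from JSON configuration files.

package frc.robot.subsystems;

import java.io.File;
import java.io.IOException;

import edu.wpi.first.math.geometry.Translation2d;
import edu.wpi.first.wpilibj.Filesystem;
import edu.wpi.first.wpilibj2.command.SubsystemBase;
import swervelib.SwerveDrive;
import swervelib.parser.SwerveParser;

public class SwerveSubsystem extends SubsystemBase {
  private final SwerveDrive swerveDrive;

  public SwerveSubsystem() {
    // YAGSL builds the swerve drive from JSON files under src/main/deploy/swerve.
    File swerveConfig = new File(Filesystem.getDeployDirectory(), "swerve");
    try {
      // 4.5 m/s is a placeholder maximum speed, not our tuned value.
      swerveDrive = new SwerveParser(swerveConfig).createSwerveDrive(4.5);
    } catch (IOException e) {
      throw new RuntimeException("Could not read the swerve configuration", e);
    }
  }

  /** Field-relative drive call used by both teleop and the autonomous paths. */
  public void drive(Translation2d translationMetersPerSec, double rotationRadPerSec) {
    swerveDrive.drive(translationMetersPerSec, rotationRadPerSec, true, false);
  }
}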

This meant we could change the autonomous path right before a match and adapt our autonomous to our alliance partners. Thanks to the accurate odometry and PathPlanner, our autonomous was extremely reliable and consistently delivered points. We ran 1+1 and 2+1 note paths depending on our alliance and how many notes our partners were scoring.
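As a rough sketch (not our exact code) of how that last-minute selection can be wired up, each PathPlanner routine is registered in a dashboard chooser and the drive team picks one before the match. The routine names below are placeholders, and this assumes PathPlanner's AutoBuilder has already been configured for the swerve drive.

import com.pathplanner.lib.commands.PathPlannerAuto;
import edu.wpi.first.wpilibj.smartdashboard.SendableChooser;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;
import edu.wpi.first.wpilibj2.command.Command;

public class AutoSelector {
  private final SendableChooser<Command> chooser = new SendableChooser<>();

  public AutoSelector() {
    // Each option loads an auto built in the PathPlanner GUI; the names are placeholders.
    chooser.setDefaultOption("1+1", new PathPlannerAuto("OnePlusOne"));
    chooser.addOption("2+1", new PathPlannerAuto("TwoPlusOne"));
    SmartDashboard.putData("Auto Routine", chooser);
  }

  /** Called at the start of autonomous to schedule whatever the drive team selected. */
  public Command getSelected() {
    return chooser.getSelected();
  }
}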

We also used computer vision in the form of a Limelight and PhotonVision running on a Raspberry Pi. We ran note detection on the Limelight using a Google Coral accelerator and AprilTag detection on the Raspberry Pi using PhotonVision. We wrote alignment code to line the robot up with a note for intaking and with the speaker before shooting.
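Here is a minimal sketch of the note-alignment idea, assuming the Limelight publishes its horizontal offset as "tx" over NetworkTables (its default behaviour); the gain and deadband values are placeholders. The speaker alignment with PhotonVision followed the same pattern, using the yaw PhotonVision reports for the AprilTag.

import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightAlign {
  private static final double kP = 0.03;          // proportional gain (placeholder value)
  private static final double kDeadbandDeg = 1.0; // stop correcting when roughly centred

  /** Returns a rotation rate that steers the robot toward the detected note. */
  public static double rotationTowardTarget() {
    double txDegrees = NetworkTableInstance.getDefault()
        .getTable("limelight")
        .getEntry("tx")
        .getDouble(0.0);

    if (Math.abs(txDegrees) < kDeadbandDeg) {
      return 0.0; // close enough to centred, no correction needed
    }
    return -kP * txDegrees; // negative so the robot rotates toward the target
  }
}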

Since the cameras only gave us the angles to the target, we had to use trigonometry to derive formulas that converted those angles into the distance the robot should travel.

Initially we used trigonometry to convert the camera's angle to the AprilTag into the angle the robot should turn. We later changed this approach: we tilted the camera upwards and instead computed the distance between the camera and the AprilTag.
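A sketch of that tilted-camera calculation is below; the mounting angle and heights are placeholder values, not our robot's real measurements. Knowing the fixed upward tilt of the camera and the vertical angle it reports to the tag, the horizontal distance follows from the height difference divided by the tangent of the combined angle.

public class SpeakerDistance {
  // All three constants are placeholders, not our robot's actual measurements.
  private static final double CAMERA_MOUNT_ANGLE_DEG = 25.0; // upward tilt of the camera
  private static final double CAMERA_HEIGHT_M = 0.40;        // lens height above the carpet
  private static final double TAG_HEIGHT_M = 1.45;           // centre height of the speaker AprilTag

  /** Estimates the horizontal distance to the tag from the vertical angle (ty) the camera reports. */
  public static double distanceToTagMeters(double tyDegrees) {
    double angleRad = Math.toRadians(CAMERA_MOUNT_ANGLE_DEG + tyDegrees);
    return (TAG_HEIGHT_M - CAMERA_HEIGHT_M) / Math.tan(angleRad);
  }
}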

In the end our code would do this: 1. turn to face the speaker, 2. drive right up to the speaker (to the perfect distance for shooting a note), and 3. stop and wait for the driver to give the shoot command.
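Expressed as a WPILib command sequence, that behaviour looks roughly like the sketch below. aimAtSpeaker() and driveToShootingDistance() are hypothetical command factories standing in for our actual vision-alignment commands.

import java.util.function.BooleanSupplier;
import edu.wpi.first.wpilibj2.command.Command;
import edu.wpi.first.wpilibj2.command.Commands;

public class SpeakerRoutines {
  public static Command alignThenWaitForDriver(SwerveSubsystem swerve, BooleanSupplier shootButton) {
    return swerve.aimAtSpeaker()                    // 1. turn to face the speaker AprilTag
        .andThen(swerve.driveToShootingDistance())  // 2. close in to the ideal shooting range
        .andThen(Commands.waitUntil(shootButton));  // 3. hold until the driver commands the shot
  }
}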

Apart from these two major aspects of our code, we also coded all the subsystems and mechanisms in the robot. This included the intake, loader, shooter and arm. We interlinked these mechanisms to work together with each other. For example, when activated, the intake would run at the same time as the loader, and once the note was intaked and reached the shooter's limit switch, the loader and intake would turn off and the note would be held inside the shooter. Once the driver wanted to shoot, the shooter would slowly rev up to maximum speed, and once it got there the loader would automatically feed the note through and shoot it out.
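A hedged sketch of that interlock using WPILib command compositions is shown below; the Intake, Loader and Shooter classes and their command factories are illustrative stand-ins rather than our real subsystem code.

import edu.wpi.first.wpilibj2.command.Command;
import edu.wpi.first.wpilibj2.command.Commands;

public class NoteHandling {
  /** Run the intake and loader together until the note trips the shooter's limit switch. */
  public static Command intakeNote(Intake intake, Loader loader) {
    // intake.run() and loader.run() are assumed to be commands that spin the rollers while
    // scheduled and stop them when the composition ends.
    return Commands.parallel(intake.run(), loader.run())
        .until(loader::noteAtLimitSwitch);
  }

  /** Rev the shooter and, once it reports full speed, feed the note through with the loader. */
  public static Command shootNote(Shooter shooter, Loader loader) {
    // shooter.setTargetSpeed() is assumed to set the flywheel setpoint and finish immediately;
    // the closed-loop control keeps running in the subsystem's periodic code.
    return shooter.setTargetSpeed()
        .andThen(Commands.waitUntil(shooter::atTargetSpeed))
        .andThen(loader.feed());
  }
}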

We also cleaned up and polished the code to make it efficient and easy to use, so that future members of the team would be able to build on it.

Computer Aided Design: 

As for the CAD, Viraj and I redesigned the entire CAD of the robot, made it accurate and produced renders of it.

Initially the CAD was completely disorganised and had many duplicate files, interlinking errors and other inaccuracies. We rebuilt the entire CAD from scratch in an efficient and neat manner. We used this opportunity to finalise all the measurements and lengths, along with the specific angles, so that the robot could intake notes and feed them into both the speaker and the amp.

We also simplified and modularised the mechanisms, making them easier to manufacture while remaining strong and efficient. For example, I designed the shooter to be built from seven metal L-brackets and metal plates, making it sturdy enough to withstand the high RPM of the shooter's motor.

We also produced many renders of our CAD for the engineering handbook.


Apart from this, we were also part of the prototyping and building stage. We tested early versions of the shooter and made sure all the mechanisms worked together. We also did the electricals several times on test versions of the robot and contributed significantly to the wiring of the final robot.
