I read it cover to cover; disengagement is only mentioned as a needed item twice, but there's no mention of how to pull it off. They reused a Trimble EZ-Steer motor for the tests, and did spend a disproportionate amount of time back-calculating why it works. Very good report, hopefully they got a good mark on it at school. AOG is light years ahead of this project, other than the camera guidance portion.
I was part of this group, this was our final-year mechatronic systems engineering group project at university. Mechatronic systems engineering is a bridge between mechanical, electrical, and software engineering. We're not experts at anything, we just make the different pieces work together.
The instructors wouldn't let us use an EZ-Steer motor because our project wouldn't have had enough "mechanical," so we basically made a clone out of a similar-size DC motor with an encoder. We did follow the same DB9 motor pinout, and our system could work with an EZ-Steer without modification. As for the "disproportionate amount of time calculating why it works": that was the end of the project, the cool stuff was done, and we were just creating pages of content to match the marking scheme. We received top marks.
It's been 5 years since I worked on this project, but here's the best I can remember. We achieved disengagement in two ways.
- First way, we monitored motor current through the BTS7960's current sensing function. Basically the same as what many are doing with the ACS712, but the BTS7960 speed controller has that hardware built in. That kind of worked, but as many here have found, it's difficult to find a happy cut-off point that's easy to manually override but doesn't give false positives during heavy turning. We had to leave the cut-off point fairly high.
- The second way was through monitoring the I term in the motor's PID position controller. That worked incredibly well. It didn't really require any fine tuning at all, it just worked perfectly right from the first try. I think we just had a simple IF greater than X for xxx ms statement. This functionality requires an encoder on the motor, and no one is really doing that yet with AoG. There's a rough sketch of both checks after this list.
- Trimble also uses both current monitoring and the encoder position somehow. I theorize that most of the disengagements are from the encoder.
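For anyone curious how those two checks fit together, here's a rough sketch in Arduino-style C++. This is not our original code (I no longer have it); the pin, the current scale factor, and every threshold number below are placeholders just to show the idea, and the position PID itself is assumed to run elsewhere in the loop.

```cpp
#include <Arduino.h>
#include <math.h>

// Placeholder thresholds -- tune for your own motor and steering setup.
const float CURRENT_TRIP_AMPS     = 8.0;   // current cut-off (we had to leave ours fairly high)
const float ITERM_TRIP            = 40.0;  // PID integral-term limit
const unsigned long ITERM_TRIP_MS = 300;   // how long the I term must stay above the limit

float pidIntegral = 0.0;           // accumulated by the position PID (not shown here)
unsigned long iTermHighSince = 0;  // 0 = I term currently below the limit
bool steerEngaged = true;

// Convert the BTS7960 IS pin reading on A0 to amps.
// The amps-per-volt factor depends on the board's sense resistor -- placeholder value.
float readMotorCurrent() {
  return analogRead(A0) * (5.0 / 1023.0) * 10.0;
}

void disengage() {
  steerEngaged = false;
  // Also: stop PWM to the BTS7960, reset the PID, tell the main program, etc.
}

void checkDisengage() {
  // 1) Current check: the operator fighting the wheel spikes the motor current.
  if (readMotorCurrent() > CURRENT_TRIP_AMPS) {
    disengage();
    return;
  }

  // 2) I-term check: holding the wheel away from the commanded position winds up
  //    the integral term even when the current stays modest.
  if (fabs(pidIntegral) > ITERM_TRIP) {
    if (iTermHighSince == 0) iTermHighSince = millis();
    if (millis() - iTermHighSince > ITERM_TRIP_MS) disengage();
  } else {
    iTermHighSince = 0;  // I term settled back down, reset the timer
  }
}

void setup() {
  // Encoder, PWM, and PID setup would go here.
}

void loop() {
  // The position PID would run here and update pidIntegral each pass.
  if (steerEngaged) checkDisengage();
}
```

The I-term check only makes sense if there's an encoder on the motor driving a position PID, which is why it won't drop straight into a typical AoG setup.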
I never ended up using the project in the field. I wrote the vision processing code in MATLAB and other team members deployed everything onto a Pi; I never learned Python, so I didn't continue development after graduation. Balancing farming and a full-time job didn't leave much time for projects like this either.
Here's the final system at presentation time, removed from the tractor. A "tablet" that houses a touch screen, Pi 3B, Arduino, BTS7960 speed controller, and 5 V converter; a DIY EZ-Steer clone (it tipped out of the way and everything); and a simple inexpensive USB webcam.
AOG is light years ahead of this project, other than the camera guidance portion.
I don't think it would be that difficult to rewrite the camera guidance portion as a standalone C# application that outputs a steer angle to conventional AoG electronics, but I'm hoping that with RTK I won't have to.
I've been piecing together parts to build my first AoG system this winter. I hope I'll be able to contribute to development in some form once I'm up to speed and have some AoG experience.