I used two simpleRTK2B XLR boards and set up my own base station. Linak actuators (750 N) handle the front and rear steering, with an RQH100030 as the WAS. Wheelbase: 150 cm, track: 100 cm. Two 3 kW hub motors (6 kW total), powered by six 30 Ah 12 V batteries in series (72 V, 30 Ah). A BNO085 IMU and UDP tie the whole system together. My aim is to put a 600 liter tank on the vehicle and build a 10 meter spray boom from aluminum sigma profile that opens and closes with a linear actuator. Fine adjustments are in progress now. In the future I plan to do obstacle detection with ultrasonic sensors, LiDAR and a depth camera, and rate control either with a flow meter or by installing PWM spray nozzles (5-50 Hz). This tool has been my dream for years, and thanks to you I succeeded. Best regards.
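Since the post mentions rate control via a flow meter or PWM nozzles, here is a minimal sketch of the standard sprayer calibration arithmetic. The function names and the nozzle capacity parameter are my own illustration, not part of the build:

```python
def required_flow_lpm(rate_l_per_ha: float, speed_kmh: float, boom_m: float) -> float:
    """Standard sprayer calibration formula:
    flow (L/min) = rate (L/ha) x speed (km/h) x boom width (m) / 600."""
    return rate_l_per_ha * speed_kmh * boom_m / 600.0

def pwm_duty(required_lpm: float, nozzles_max_lpm: float) -> float:
    """Fraction of each 5-50 Hz PWM cycle the solenoid nozzles stay open,
    clamped to [0, 1]. nozzles_max_lpm (hypothetical parameter) is the
    total boom flow at 100% duty."""
    return min(1.0, max(0.0, required_lpm / nozzles_max_lpm))

# e.g. 200 L/ha at 8 km/h with a 10 m boom:
flow = required_flow_lpm(200, 8, 10)   # ~26.7 L/min
duty = pwm_duty(flow, 40.0)            # ~0.67 if the nozzles pass 40 L/min wide open
print(round(flow, 2), round(duty, 2))
```

With a 600 l tank, that example rate would cover about 3 ha per fill.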

Thank you so much to everyone who helped me, including anyone whose name I forgot to write. I love this group very much; thanks to you, everyone is respectful and very knowledgeable. @Brian Tischler, Muammer EKREM @Ray, @Jorgensen, @Weder, @Math, @Matthias Hammer, @Gianluca, Andreas Ortner,
@Benjamin, @Potato Farmer, @Baraki, M_Elias, Vili, Kent Stuff, Damien, BlueRabbit, Torriem, GoRoNb, Bennet, Pat, Larsvest, CommonRail, KansasFarmer, Kaupoi, Alan Webb, Jcmach.



Very cool, many applications beyond spraying as well.

The cut quality of the parts is quite high. Were you using water-jet or plasma?


Hi, great work, really professional! Do you have flat fields, and will it have enough power to transport 600 l of water? Have you already looked into how to detect weeds, for example?
Maybe a sensor to detect obstacles? I like the project, keep us updated :)

Regards, Peter


Thank you! It was cut on a 6 kW fiber laser belonging to a friend.


Hello. I've done my experiments with a load of up to 500 kg at 10 km/h with no problem, on grades up to 10%. I'll just replace the wheels with a wider base and sprocket wheels. I've started working on weed detection with artificial intelligence: R-CNN, TensorFlow, NVIDIA CUDA and OpenCV in software, with a Raspberry Pi + camera as hardware. I will use LiDAR and ultrasonic sensors for obstacle detection.
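As a rough illustration of the camera side (not the author's R-CNN pipeline; just the classical Excess Green baseline that is often used to find vegetation candidates before a heavier classifier runs):

```python
import numpy as np

def vegetation_mask(rgb: np.ndarray, thresh: int = 20) -> np.ndarray:
    """Excess Green index (ExG = 2G - R - B): a cheap way to separate
    green vegetation from soil in an RGB frame. Pixels whose ExG exceeds
    the threshold are flagged as candidate plants/weeds."""
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    return ((2 * g - r - b) > thresh).astype(np.uint8)

# Toy frame: brownish "soil" everywhere, one 2x2 green patch
frame = np.full((4, 4, 3), (120, 90, 60), dtype=np.uint8)
frame[1:3, 1:3] = (40, 180, 40)
print(vegetation_mask(frame).sum())  # 4 candidate pixels
```

On a Raspberry Pi this kind of index runs at camera frame rate, which is why it makes a good pre-filter before the expensive neural network.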


When it's not spraying crops, you should teach it to pick up recyclables for deposit. If you could wake up every morning with 500 kg of pop cans delivered, you would not need a day job.

I am assuming there are litterbugs where you are as well.


This looks great!

How are you getting on with modelling? I collected a few thousand photos of dock weeds last year and built a few models based on MobileNet - Weed detection test 2 (getting better) - YouTube (see also the fast.ai thread where people discussed models: Grassland Weed detector - #94 by ptd006 - Deep Learning - Deep Learning Course Forums)

Is a Raspberry Pi really fast enough for real-time recognition? (I needed a Jetson Nano.)

Seeing your project and just talking to my dad (who's been out spot spraying) has got me quite enthusiastic about this project again. Like AgOpenGPS is for autosteer, I think "community" alternatives to Bilberry and JD/Blue River would be nice.


I have thought that a system like this would work more easily if you just gave it a bunch of waypoints in a field, or in a garden for proof of concept. It would be less programming than trying to make a fully autonomous rover, and you could focus on the weed-control aspect.
Using a program like ArduPilot, the mission path would be easy enough to set up, and with an ArduSimple board and RTK the path would be accurate enough to navigate a garden.
I would be very interested in building something like this, but I wasn't sure how to find a program that looks for weeds, pinpoints a location, and directs an arm to go to that location.
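For what it's worth, the waypoint side really is just arithmetic. Here is a sketch of generating a boustrophedon (lawnmower) pattern you could then load as an ArduPilot mission; the flat-earth lat/lon approximation is my own simplification, fine over a garden-sized plot:

```python
import math

def survey_waypoints(lat0, lon0, rows, row_len_m, spacing_m):
    """Boustrophedon pattern: alternate row direction each pass so the
    rover never has to cross the plot. Metre offsets are converted to
    degrees with an equirectangular approximation (accurate over tens
    of metres)."""
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat0))
    pts = []
    for i in range(rows):
        y = i * spacing_m
        # even rows run west->east, odd rows run back east->west
        xs = (0.0, row_len_m) if i % 2 == 0 else (row_len_m, 0.0)
        for x in xs:
            pts.append((lat0 + y / m_per_deg_lat, lon0 + x / m_per_deg_lon))
    return pts

# 3 rows, 20 m long, 2 m apart -> 6 waypoints snaking across the plot
wps = survey_waypoints(47.0, 8.0, rows=3, row_len_m=20, spacing_m=2)
print(len(wps))  # 6
```

Each (lat, lon) pair would become one NAV_WAYPOINT item in the mission; with RTK the rover should track the rows to a few centimetres.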


This is basically what I'm doing: using QGroundControl survey mode you can create the track very quickly (no need to think about individual waypoints).

For spraying the weeds: yes, you could have a fancy arm with a laser cutter or something (seriously, I did start talking to someone about that, haha)... or KISS: split the input image into a grid and have a mini solenoid spray nozzle for each box (I have 8), triggered if you're above a certain threshold.
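The grid-plus-solenoids idea is simple enough to sketch. This is my own minimal version, assuming a binary weed mask from the detector and 8 nozzles spread across the frame width:

```python
import numpy as np

def nozzle_triggers(weed_mask: np.ndarray, n_nozzles: int = 8,
                    threshold: float = 0.05) -> list:
    """Split a binary weed mask into vertical strips, one per nozzle,
    and open a nozzle when its strip's weed-pixel fraction exceeds
    the threshold."""
    strips = np.array_split(weed_mask, n_nozzles, axis=1)
    return [bool(strip.mean() > threshold) for strip in strips]

# Weeds only in the left quarter of a 64x64 frame -> first 2 of 8 nozzles fire
mask = np.zeros((64, 64), dtype=np.uint8)
mask[:, :16] = 1
print(nozzle_triggers(mask))
# [True, True, False, False, False, False, False, False]
```

The threshold keeps single-pixel detector noise from firing a nozzle; in practice you would also hold each trigger open long enough for the rover to carry the nozzle over the weed.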

I put some code up here last year → https://github.com/ptd006/WeedML
You can of course do much better but this was the easiest starting point I could think of.

I have an unopened F9P (the M8T I played with last year wasn't accurate enough) that I'm now motivated to try out. My robot is a joke compared to whiterose's, which looks an impressive bit of kit :clap: :clap:


@ptd006 are you following the path created by QGroundControl with AOG?

No but that was the original plan (when I first started AgOpenGPS fixes on Linux). I changed to ArduRover like Mueller mentioned running on a Pixhawk (which is vaguely like the standard AgOpenGPS PCB but with a more powerful processor and doesn’t need a separate PC).


I was thinking an old self-propelled snow blower or an old walk-behind garden tiller would be a great starting point for the rover: a strong axle and differential. Make it center-pivoted for steering and you'd have a very simple and stable rover.


Hello, thank you for taking an interest in my project. I will work with a Raspberry Pi 4 and Google Coral in the first place, and I am thinking of using a Raspberry Pi infrared camera. Among my plans for the next step is a Jetson Nano 4 GB or a Jetson Xavier; the Raspberry is pretty fast. Thank you very much for your suggestions. Everyone takes first steps, then walks, and then runs... Your project is very nice, and I've gone through the same challenges you went through; when there is an improvement, I will definitely share it open source. My vehicle is 100 cm x 150 cm. Today I realized that when I enter these measurements in AOG, the vehicle constantly wobbles, so I set the vehicle dimensions in AOG to 200 x 300 and the wobble disappeared. What first got me started on this were JD-Blue River, Carbon Bee, WeedIt and WeedSeeker. Respects.


Hi, your project is very interesting. Congratulations. But are you using Agrabot or AgOpenGPS as the software base? For example, to start the machine, go fast or slow, etc.? Thanks.

I've seen this video; kinda related, but some pretty fancy hardware.


Awesome video! But does it work in a high-crop configuration?

I weed the vegetable garden with a torch. This is awesome !!!

Unfortunately, I do not approve of killing ants or insects by burning. It could be done by spraying herbicides instead of burning. But the system is very nice.


Sprays are messing up bees, trees, and soil microbes. Desiccation is a hot button issue due to residuals left on food. But I admit short term they are convenient.

If you're killing, you're killing; you have collateral damage regardless. Nature uses fire, Monsanto uses chemicals.

The world is in a tough spot; we cannot maintain the population without synthetically propping up crop nutrients. The land is constantly left in a damaged state that nature tries to repair with weeds, only for them to get doused multiple times in chemicals. Monoculture gives pests an almost inexhaustible food supply to multiply with, only to be controlled with more chemicals. All modern farming practices require a ton of fossil fuels: as feedstock for chemicals and their processing, for mining nutrients, and for energy to turn the wheels.

There is good and bad in modern farming. I think the engineering challenges yet to be faced to secure a stable food supply without strip mining the planet are huge.

But the current eco-driven government policy planners seem completely blind to how the world is actually fed. I think anti-oil policies and the food supply will butt heads in a big way very soon.

But this robot could use a solar powered laser beam, lol.


Friken lazer beams!! I love it

I think if a system like this were to be taken seriously and not made for YouTube views, the fire would be much more controlled.

Assuming the robot can control it effectively with an arm like that, could a large lens (magnifying glass) be more effective than a laser powered by PV cells?