Communications Options

Some scenarios warrant consideration of a wireless capability, such as:

  • Receive the day’s weather forecast, to plan power consumption and the day’s objectives.
  • Receive warning of an impending high-wind or hail situation so it can return to a shed or other emergency shelter.
  • Receive instructions for assignment update (e.g., “shift to cultivating the rows of bell peppers”)
  • Communicate a heartbeat or warning/error status
  • Communicate an image of a new unknown plant/object (or access an online plant identification app to automate the learning process)

If there is to be any wireless communication with a local monitoring unit and/or the outside world, options include:

  • WiFi:
    – Pros: high bandwidth
    – Cons: short range relative to field/pasture acreage. Range extenders are available, though they may still be insufficient depending on the acreage, layout, and/or intervening hills

  • Low-Power WAN (unlicensed LPWAN), with examples including:
    – MyThings
    – LoRa: non-continuous; devices only wake at specific times (a LoRaWAN limitation)
    – SigFox: maximum packet payload of 12 bytes, and no more than 140 uplink packets per device per day (see the payload sketch after this list)
    – Pros: range up to 10 km
    – Cons: low data rate (~10 kbps at 1 km)

  • WiFi HaLow
    – Pros: up to 1 km range
    – Cons: lower frequencies have lower bandwidth (e.g., 1 km range at ~100 kbps)

  • SDRs (example: Lime Mini)
    – Pros: high flexibility in frequency choice
    – Cons: additional configuration required
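
If SigFox-style limits (12-byte payloads, a fixed daily packet budget) are in play, a heartbeat or status report has to be packed very tightly. A rough Python sketch of what that could look like; the field choices, scalings, and layout here are my own assumptions, not any project's actual format:

```python
import struct

# Hypothetical 12-byte status payload for a SigFox-class uplink.
# Field names, scaling, and layout are illustrative assumptions.
def pack_status(lat_deg, lon_deg, battery_pct, status_code):
    """Pack position, battery level, and a status code into exactly 12 bytes."""
    lat_fixed = int(round(lat_deg * 1e5))   # ~1 m resolution in a signed 32-bit int
    lon_fixed = int(round(lon_deg * 1e5))
    return struct.pack("<iiBBH", lat_fixed, lon_fixed,
                       int(battery_pct), status_code, 0)  # last 2 bytes left spare

payload = pack_status(37.3861, -122.0839, 87, 1)
assert len(payload) == 12
```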

I’m certainly interested in some kind of SDR that can do multiple protocols:
https://limemicro.com/products/boards/limesdr-mini/

But I don’t know what kind of bandwidth to expect from that. I have been curious about LoRaWAN too. I have experience with low-data-rate sub-GHz systems and I really like the range they offer.

We’re using WiFi for now with a Ubiquiti UniFi mesh network. It’s a mild PITA but at least it is flexible.

I’ve just started drawing video data off of Acorn today from the new camera system, and some kind of high-bandwidth link will be important. I’ve been using high-powered 2.4 GHz wireless so far due to the range advantage, but I may want to add 5 GHz as well for speed when available.

We do require a heartbeat to the server or Acorn will stop driving. It’s a good indicator of issues when I see those stops in the error logs.

Also, one key part of the wireless is for RTK GPS correction data. To accurately solve for its GPS location, the robot must have periodic updates. I’m not sure what update rate is strictly required, but if the GPS base station data is more than 20 seconds old we stop driving, though I think that could be relaxed to maybe 5 minutes.
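
For anyone following along, a minimal sketch of that "stop driving on stale data" behavior could look like this (Python; the class and the 5-second heartbeat threshold are assumptions on my part, only the 20-second RTK figure comes from the post):

```python
import time

# Sketch of a link/correction freshness gate; names and the heartbeat
# threshold are assumptions, the 20 s RTK limit is from the discussion above.
RTK_MAX_AGE_S = 20.0        # stop if base station corrections are older than this
HEARTBEAT_MAX_AGE_S = 5.0   # assumed freshness requirement for the server heartbeat

class LinkMonitor:
    def __init__(self):
        self.last_rtk = None
        self.last_heartbeat = None

    def note_rtk_correction(self):
        self.last_rtk = time.monotonic()

    def note_heartbeat(self):
        self.last_heartbeat = time.monotonic()

    def ok_to_drive(self):
        now = time.monotonic()
        rtk_fresh = self.last_rtk is not None and (now - self.last_rtk) < RTK_MAX_AGE_S
        hb_fresh = (self.last_heartbeat is not None
                    and (now - self.last_heartbeat) < HEARTBEAT_MAX_AGE_S)
        return rtk_fresh and hb_fresh
```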

The Lime Mini looks very intriguing.

What are the use cases for video, or is that really just for early prototyping?

If video is deemed necessary, what resolution and frame rate are considered minimums? Doing object detection outside at home, I really only use 2 frames/sec FHD. I can record at much higher rates, of course, though what the rover needs will depend on the use cases.

Well, I think we will want to use higher-resolution imaging when building the dataset, even if we train at lower resolutions. This gives us more flexibility and allows for some other interesting post-processing work. For example, I recorded two 4k 15 fps videos from the onboard cameras over 100 meters of travel, and while I can set the bitrate to whatever I want, that capture added up to 2.4 GB.
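
Just to put that 2.4 GB in perspective against the link options summarized earlier (the LoRa and HaLow rates are the rough figures from that list; the ~50 Mbps WiFi number is my own assumption for a healthy link):

```python
# Back-of-the-envelope transfer times for the 2.4 GB capture mentioned above.
capture_bits = 2.4e9 * 8

for name, bps in [("LoRa (~10 kbps)", 10e3),
                  ("WiFi HaLow (~100 kbps)", 100e3),
                  ("WiFi (~50 Mbps, assumed)", 50e6)]:
    hours = capture_bits / bps / 3600
    print(f"{name}: {hours:,.1f} hours to move 2.4 GB")
```

That works out to something like 500+ hours over LoRa, ~50 hours over HaLow, and a few minutes over a decent WiFi link, which is why the bulk-data path matters so much for dataset capture.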

I have this idea that we will some day be able to create high resolution 3D reconstructions of the ground Acorn travels over, which could provide clear photographic views of everything at the farm close up. This would be useful for labeling images and just keeping an eye on things.

And then there is the question of whether we want to do ML processing onboard (I think yes) or stream video to a high-powered server nearby for recognition. If we do the latter we would need a very good wireless link.

At a minimum we need to be able to get lots of high resolution data off of it for labeling. Dataset creation will never be finished, so I see it as an important function in the field. However there may be cases where a fixed model is “good enough” and only a low bandwidth connection for RTK corrections is needed.

Some of the options for building the dataset:

  1. High-speed WiFi
  2. Capture to SD card (no high-speed comm required)

When you say ML onboard, shall we assume you are always referring to inferencing, for those who may be reading this? IMO, a strong case could be made for exactly that, as increasing levels of autonomy make for a more resilient agribot that is not tied tightly to the base station for every little thing.

I agree about the “good enough” and low bandwidth connections for both RTK corrections and new plant object ‘discoveries’.

Well, given that the two cost premiums are bandwidth and data storage, with data storage also increasing processing cost, there is always the age-old question of how much is enough.

4k is likely massive overkill, though it’s easier to start with overkill and then produce increasingly lower-bitrate datasets from the same original high-bitrate set and train identical models on them until a minimum resolution and bitrate is identified. Likewise, 60 fps video may be overkill versus a high-shutter-speed, high-resolution image at, say, 2 fps.
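
That downscaling step is cheap to automate. A sketch, assuming ffmpeg is installed (the source filename and the resolution/bitrate ladder are placeholders):

```python
import subprocess

# Derive progressively lower-resolution/bitrate copies from one high-bitrate capture.
SOURCE = "acorn_capture_4k_15fps.mp4"              # placeholder filename
LADDER = [(2560, "10M"), (1920, "6M"), (1280, "3M"), (640, "1M")]

for width, bitrate in LADDER:
    out = f"dataset_{width}w_{bitrate}.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", f"scale={width}:-2",                # keep aspect ratio, force even height
        "-b:v", bitrate,
        out,
    ], check=True)
```

Training identical models on each rung would then show where accuracy actually starts to fall off.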

Communication too. A few hundred GB of storage is cheap, even 1-3 TB, but transferring it at high rates over distance is not. Automatic retrieval is preferred, so a combo of an overseer network using a low-bandwidth, long-range system plus high-bandwidth, short-range 5 GHz WiFi for when the rover is near the home station is maybe the best of both worlds.
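
One way that combo could look in practice is a simple "offload only when the base station is reachable" step; this is a rough sketch, with the hostname, paths, and use of rsync over SSH all being assumptions:

```python
import socket
import subprocess

BASE_HOST = "farm-base.local"                      # placeholder base station name
LOCAL_CAPTURES = "/data/captures/"
REMOTE_CAPTURES = f"{BASE_HOST}:/srv/acorn/captures/"

def base_station_reachable(timeout_s=2.0):
    """Cheap reachability check over the short-range WiFi link."""
    try:
        socket.create_connection((BASE_HOST, 22), timeout=timeout_s).close()
        return True
    except OSError:
        return False

if base_station_reachable():
    # Bulk transfer over the 5 GHz link; --partial makes interrupted copies resumable.
    subprocess.run(["rsync", "-a", "--partial", LOCAL_CAPTURES, REMOTE_CAPTURES],
                   check=True)
else:
    print("Out of WiFi range; keeping captures on local storage for now.")
```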

Yes inference onboard.

4k is only 8.5 megapixels, and depending on your field of view it ends up not being a whole lot of pixel density. I am currently using wide-angle lenses so you can see plants from multiple angles and see the front of the robot as well as the rear, where the tools will be, in one image. This will allow for good automatic synchronization between tools and cameras (“hand”-eye coordination).

I have four 4k cameras on my Rover robot and while it is a LOT of data and often is not needed, there are times where you want a closer look at something.

If you’re a farmer and you’re in the field and you see something odd, you’re going to bend down and look more closely. Unless you’re going to physically move the cameras or add optical zoom the only way to get a closer look is to be able to digitally zoom in, so to me that means higher resolution sensors that you often run at lower resolution.
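
For what it's worth, the "closer look" can be as simple as cropping a stored full-resolution frame and upscaling it (OpenCV sketch; the filename and zoom factor are placeholders):

```python
import cv2

frame = cv2.imread("full_res_frame.png")   # e.g. a 4k still pulled off the robot
h, w = frame.shape[:2]

zoom = 3                                   # 3x closer look at the frame center
cw, ch = w // zoom, h // zoom
x0, y0 = (w - cw) // 2, (h - ch) // 2
crop = frame[y0:y0 + ch, x0:x0 + cw]

# Upscale the crop back to the original size; detail is limited by sensor resolution.
closer_look = cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
cv2.imwrite("zoomed.png", closer_look)
```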

My personal camera from 2017 is 20 megapixels, and it can shoot 6K video, which I have found helpful for some machine vision (photogrammetry) experiments.

That said, the optics matter just as much. Right now we are using these relatively low-cost machine vision cameras, and it is obvious to me that the biggest limiting factor is the optics.

We have basically the first prototype of a vision system onboard and it is going to be an evolving design as things progress.

Check out OpenMV: https://openmv.io/
A MicroPython-based machine vision platform. Camera included, Cortex-M7, Arduino compatible. Also supports TensorFlow and other DL methods.

I use MicroPython almost exclusively for automotive and IoT commercial applications. It rocks!

For enhanced navigation, you could borrow something I saw used by drones. Tie colored ribbons at various locations. These blobs are easy to spot.
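
On the OpenMV board mentioned above, spotting a brightly colored ribbon is a few lines of its MicroPython API; something along these lines (the LAB color thresholds are placeholders to tune against the actual ribbon):

```python
import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

RIBBON_THRESHOLDS = [(30, 100, 15, 127, 15, 127)]   # rough "red-ish" LAB range, placeholder

while True:
    img = sensor.snapshot()
    for blob in img.find_blobs(RIBBON_THRESHOLDS,
                               pixels_threshold=100,
                               area_threshold=100,
                               merge=True):
        img.draw_rectangle(blob.rect())             # mark the candidate ribbon marker
```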

Sending video via RF will be a big problem, especially in agriculture. You simply can’t send WiFi over more than a couple of hundred feet, not to mention the power drain on your batteries. Adding APs can help, but only if you have a BIG battery or AC power. Long-range RF links like LoRa are simply too bandwidth-limited. So it’s best to have as much on-board video processing as possible. Photo images of crops can be stored to flash for later download.

Excellent summary @WillStewart . I design telemetry for agriculture. It’s tough and there are a lot of tradeoffs. You have to worry about minimizing power and maximizing battery life (no AC plugs in a corn field), and it has to stand up to some pretty brutal environmental conditions. But that’s what makes it fun.

As a general rule, longer distance means lower data rate, sometimes under 1 kbaud (don’t believe the marketing claims). I’ve been using proprietary 900 MHz RF modules and am now starting to use LoRa in a P2P topology. 1200-9600 baud is the sweet spot. Also, there are FCC regs you have to make sure you comply with. Even LTE CAT1 IoT is limited to under 200 kbaud and is VERY expensive (charged by the megabyte). Yikes!
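
To make those numbers concrete (treating baud as roughly bits per second for these modules; the 32-byte telemetry message and 200 kB photo are assumed example sizes):

```python
telemetry_bits = 32 * 8
photo_bits = 200e3 * 8

for rate in (1200, 9600):
    print(f"{rate} baud: telemetry in {telemetry_bits / rate:.2f} s, "
          f"one 200 kB photo in {photo_bits / rate / 60:.1f} min")
```

Small telemetry is near-instant, but even a single modest photo ties up the link for minutes, so these really are telemetry links rather than image links.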

So autonomous operation is quite desirable.

@bradstew , your ag experiences are invaluable. My comm summary is based on specs, seat-of-the-pants reports from some ag users, and generalization. Any tips on correcting/clarifying any of the points/ranges/rates would be very helpful.
