Software stack: why not ROS?

Hi Taylor,

I have a quick question: what's the reason you didn't go for ROS for the software?

Cheers


Acorn will be ROS compatible. We are working hard to make the “firmware” level system lightweight and able to run on the smallest processors. But the high level systems will all be ROS compatible.

Hi Marco!

Well, we could have used ROS, as I have enough experience with it (two different projects, 19 months total of full-time work as a ROS software engineer).

I did start using ROS on this project at one point, for a little while. I found a nice four-wheel-steering controller I wanted to use, but the ROS package that implemented it supported a whole load of different configurations and was consequently pretty complex. The author gave a one-hour lecture on the architecture of the package! I never quite understood it, and eventually pulled the two functions I needed out of the C++ code and just ran them in Python with a few small syntax changes. Later I found a white paper on the required math and rewrote those functions from scratch to add strafe capability.
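For the curious, here is a rough sketch of that kind of four-wheel-steering math. This is not the original package's code or our code; the wheel positions and sign conventions are made up for illustration.

```python
# Rough sketch of four-wheel-steering kinematics with strafe, written from
# scratch for illustration only. Assumes wheel positions given relative to
# the robot center; inputs are body-frame velocities vx, vy (m/s) and yaw
# rate omega (rad/s).
import math

# Hypothetical wheel offsets from the robot center (meters): (x, y)
WHEELS = {
    "front_left":  ( 0.5,  0.4),
    "front_right": ( 0.5, -0.4),
    "rear_left":   (-0.5,  0.4),
    "rear_right":  (-0.5, -0.4),
}

def wheel_commands(vx, vy, omega):
    """Return {wheel: (steering_angle_rad, speed_m_s)} for the requested
    body velocity. Each wheel's velocity is the body velocity plus the
    contribution of rotation about the center (omega cross r)."""
    commands = {}
    for name, (x, y) in WHEELS.items():
        wvx = vx - omega * y   # rotation adds -omega*y to the x component
        wvy = vy + omega * x   # and +omega*x to the y component
        commands[name] = (math.atan2(wvy, wvx), math.hypot(wvx, wvy))
    return commands

# Pure strafe: drive sideways at 0.3 m/s, so all wheels point 90 degrees.
print(wheel_commands(0.0, 0.3, 0.0))
```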

The benefits of ROS are interprocess communication, managed application launches, and a big library of ROS drivers for all kinds of off-the-shelf LIDAR units, sensors, and other robot parts, plus easy connections to software packages like Octomap, various 2D mappers, navigation controllers, and so on.

On the first two points, Python already offers facilities for interprocess communication and launching processes, so those alone are not an advantage. As for standard drivers, I didn't foresee using any hardware that required them; we're not using any LIDAR, for example. Similarly, most of the nice packages like 2D mappers are not immediately useful to us without a fair amount of customization, and most of those programs are designed to run with or without ROS, so if we are going to use and customize them we could just patch them into Python.
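For example, here is a rough sketch of the kind of thing the Python standard library already gives you. This is just an illustration of the point, not Acorn's actual architecture; the worker function, queue contents, and script name are invented for this example.

```python
# Minimal sketch of interprocess communication and process launching using
# only the Python standard library. Not Acorn's actual code; names and
# values here are hypothetical.
import multiprocessing as mp
import subprocess

def motor_worker(command_queue):
    """Hypothetical child process that consumes drive commands."""
    while True:
        command = command_queue.get()
        if command is None:   # sentinel tells the worker to exit
            break
        print("motor process received:", command)

if __name__ == "__main__":
    # Interprocess communication: a queue shared with a child process.
    queue = mp.Queue()
    worker = mp.Process(target=motor_worker, args=(queue,))
    worker.start()
    queue.put({"vx": 0.5, "vy": 0.0, "omega": 0.1})
    queue.put(None)
    worker.join()

    # Launching an external program, roughly what a ROS launch file does.
    vision = subprocess.Popen(["python3", "vision_node.py"])  # hypothetical script
    vision.terminate()
```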

Also, in 2019 (I don't know about now), installing ROS on Raspberry Pi was complex, and it was best to use images from the folks at Ubiquity Robotics in Silicon Valley, who I am friends with (I donated and dropped off an early Raspberry Pi 4 with one of their members so he could test Pi 4 compatibility). That meant you couldn't use a base Raspbian install, only their Linux image, which had its own quirks. They couldn't support new hardware until after it was released, available at retail, and one of their volunteers got one and experimented with it. It is wonderful work, but a dependency I would prefer to avoid. And at the time ROS did not support Python 3!

Finally, I must admit I am very interested in the idea of a "pure Python" robotics stack on Raspberry Pi, largely because robotics has traditionally been very compute-heavy code written in C++ on a beefy computer, and Python on a Raspberry Pi feels like some kind of Zen version of it all. Since I am planning a computer-vision-heavy system, we will need another, more powerful computer onboard anyway, but all basic and low-level coordination can be handled on the Pi. That's all that is required for GPS-based autonomy and a host of other nice features.

As Daniel says above, we can interface with ROS systems, but so far I haven't really felt the need to run it onboard.


Check out the OpenMV project running MicroPython. It's a very fast Cortex-M7 with a camera, much lower power than an RPi with similar performance (no OS to get in the way). Instant on, sub-100 µA standby, and an Arduino interface.


Thanks! The RPi is working great for us and its power consumption is small compared to the motors, but I will consider that if we want to move to something lower power. 🙂

I wonder about using Python, though, for computationally intense processes. For tool management and modular tasks it makes a lot of sense due to the ability to quickly change and test, but a language like C++ or Rust allows for such an increase in efficiency that for tasks like inferencing it's hard to see Python as the choice, other than for ease.

EDIT: For context, I have worked with C++, C#, and Python/Jython on various things: basic coding, API interaction, Excel<>API scripting for motor FEA, and general scripting.


" This page describes how to access to the TensorFlow Lite interpreter and perform an inference using C++, Java, and Python, plus links to other resources for each supported platform."

TensorFlow Lite inference
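For concreteness, here is a minimal sketch of running an inference from Python with the TFLite interpreter; the model file and the zero-filled input are just placeholders.

```python
# Minimal sketch of a TensorFlow Lite inference from Python, along the
# lines of the document linked above. Model path and input are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")  # placeholder model
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

result = interpreter.get_tensor(output_details[0]["index"])
print(result.shape)
```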

@seancf, does this address your concern?


Thanks for the links, @WillStewart. Even that document points to the performance advantages of a compiled language (C++) vs. interpreted languages (Java, Python).


The thing to remember is that Python libraries are commonly written in C++ (Python itself is written in C!). The key to writing performant Python, in addition to optimizing the Python code itself, is to run any performance-intensive operations in C++ or similar. For example, it is generally recommended to use the NumPy library for any large matrix or array operations, and this can be done without leaving Python.
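A quick illustration of that point; the matrix sizes here are arbitrary and chosen only to show the gap between a pure-Python loop and the same work done in compiled code through NumPy.

```python
# Compare a pure-Python matrix multiply against NumPy's compiled one.
import time
import numpy as np

a = np.random.rand(1000, 1000)
b = np.random.rand(1000, 1000)

# Pure-Python loops (slow): only a 50x50 block of the result, to keep it sane.
start = time.perf_counter()
c_py = [[sum(a[i][k] * b[k][j] for k in range(1000)) for j in range(50)]
        for i in range(50)]
print("python loops (50x50 block):", time.perf_counter() - start, "s")

# The full 1000x1000 multiply, done in compiled code via NumPy.
start = time.perf_counter()
c_np = a @ b
print("numpy (full 1000x1000):   ", time.perf_counter() - start, "s")
```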

For more intensive operations like inference, we would use a library like Nvidia's DeepStream, which has both C++ and Python interfaces available.

Basically: write everything in Python except performance-intensive operations, and offload those to appropriate libraries. For me this is much better than writing everything in C++. I think accessibility to developers is one big benefit of Python, and I am eager to learn more about making it high performance.

I watched a different lecture on this in the past, but this video on the subject seems worth watching:
