Disclaimer: this is not specifically for a commercial product, but various things I design sometimes get commercialized. I mention this so that you may decide whether you want to weigh in. If it’s commercialized, I will probably make very little money but a bunch of university students may get a neat STEM program in the countryside :D

That out of the way: I’ve designed some boards for a Wi-Fi controlled robot with mecanum wheels – 4 independent motor drivers, one per wheel, allow omnidirectional motion. It’s built around a Pi Pico W, 4 SOIC-8 9110S motor drivers, and some buck/boost converters to give the system a 5V and a 12V line. It’s very basic, mostly made to be cheap. Here’s a photo:

Right now it just receives UDP communications (from a little app written in Godot) and activates the motors in different combinations – very “hello world”. I’m planning to add some autonomy to move around pre-generated maps, solve mazes, and so on.
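
In case it’s useful context, the robot side really is tiny. Here’s a minimal sketch of the receive loop, assuming MicroPython on the Pico W – the 4-byte payload format, the port number, and the set_motors() helper are placeholders for illustration, not my actual code:

```python
import network
import socket

# Join the local network (placeholder credentials).
wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("ssid", "password")
while not wlan.isconnected():
    pass

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 4242))  # arbitrary port

def set_motors(fl, fr, rl, rr):
    # Map each signed speed (-128..127) onto a direction pin + PWM duty
    # for the corresponding 9110S driver. Driver-specific, omitted here.
    pass

while True:
    data, addr = sock.recvfrom(16)
    if len(data) == 4:
        # One signed speed byte per wheel: FL, FR, RL, RR.
        fl, fr, rl, rr = (b - 256 if b > 127 else b for b in data)
        set_motors(fl, fr, rl, rr)
```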

I have foolishly used 2-pin JST connectors for the motors, so using motors with rotary encoders would be a pain without ordering new boards. I’ll probably fix that in a later board revision or just hack it in. Also the routing is sloppy and there’s no ground plane. It works well enough for development and testing though :D

What I’m thinking about right now is how to let the robot position itself in a room effectively and cheaply. I was thinking of either adding a full LiDAR or building a limited LiDAR out of a servo motor and two cheap laser ToF sensors – e.g. one pointed forward, the other backward, swept through 90 degrees. Since the LiDAR doesn’t need to be fast or sweep continuously, I’m leaning toward the latter approach.
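
The geometry for that is just a polar-to-Cartesian conversion. A rough sketch, assuming the two sensors are mounted back-to-back on the servo (the function name and sample format are made up):

```python
import math

def sweep_to_points(readings):
    """Convert (servo_angle_deg, front_mm, back_mm) samples into 2D points.

    The rear sensor reads along angle + 180 degrees, so one 90-degree
    servo sweep yields two opposite 90-degree arcs of points.
    """
    points = []
    for angle_deg, front_mm, back_mm in readings:
        a = math.radians(angle_deg)
        points.append((front_mm * math.cos(a), front_mm * math.sin(a)))
        # cos(a + pi) = -cos(a), sin(a + pi) = -sin(a)
        points.append((-back_mm * math.cos(a), -back_mm * math.sin(a)))
    return points

# e.g. readings = [(a, read_front_mm(), read_back_mm())
#                  for a in range(-45, 50, 5)]
```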

Then the processing is handled remotely: a server requests that the robot do a LiDAR sweep, the robot sends a minimal point cloud back, and the server estimates the robot’s current location and replies with instructions to move in a direction for some distance. This is probably where the lack of rotary encoders is going to hurt, but for now I’m planning on just pointing the forward laser ToF sensor towards a target and giving the instruction “turn or move forward at static speed X until the sensor reads Y”, which should be pretty easy for the MCU to handle.
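
That primitive could be as simple as the sketch below (plain Python to show the idea – read_tof_mm() and set_motors() are hypothetical helpers, and the timeout is a fail-safe so a missed reading can’t drive the robot forever):

```python
import time

def forward_until(target_mm, speed, read_tof_mm, set_motors, timeout_s=10):
    """Drive forward at a fixed speed until the front ToF sensor
    reads a distance at or below target_mm, then stop."""
    deadline = time.time() + timeout_s
    set_motors(speed, speed, speed, speed)
    while read_tof_mm() > target_mm:
        if time.time() > deadline:
            break  # fail-safe: never drive blind indefinitely
        time.sleep(0.02)  # poll the sensor at ~50 Hz
    set_motors(0, 0, 0, 0)
```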

I’m planning to control multiple robots from the same server. The robots don’t need to be super fast.

What I’m currently wondering is whether my approach really needs rotary encoders in practice – I’ve heard that mecanum wheels have enough mechanical slippage that wheel odometry ends up inaccurate, and designers often add a separate set of unpowered wheels just for position tracking. I don’t want to add more wheels that way, though.

On the other hand, it would probably be easier to tell the MCU to “move forward X rotary encoder pulses at a velocity defined by Y pulses per second, then check position and correct at a lower speed” than to use a pure LiDAR approach – even if rotary encoders don’t give me accurate position, on small time scales they give me good feedback for controlling speed. I could possibly even send a fairly complex series of instructions in one go, making the communications efficient enough to eliminate a local server and control a ton of robots from a cloud VPS or whatever.
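
For a single wheel, I imagine that loop looking something like this sketch – pulse_count() and set_pwm() are hypothetical helpers, and the gain would need tuning on real hardware:

```python
import time

def move_pulses(total_pulses, target_pps, pulse_count, set_pwm, kp=0.001):
    """Run one wheel at roughly target_pps (pulses/second) until
    total_pulses have elapsed, nudging the PWM duty each cycle."""
    start = pulse_count()
    last_count, last_t = start, time.time()
    duty = 0.0
    while pulse_count() - start < total_pulses:
        time.sleep(0.05)  # 20 Hz control loop
        now, count = time.time(), pulse_count()
        measured_pps = (count - last_count) / (now - last_t)
        duty += kp * (target_pps - measured_pps)  # crude incremental correction
        duty = min(max(duty, 0.0), 1.0)           # clamp to valid duty range
        set_pwm(duty)
        last_count, last_t = count, now
    set_pwm(0.0)
```

Accumulating the correction into the duty cycle like this behaves roughly like an integral term, so it should hold speed under varying load even though it’s only a few lines.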

Anyone have some experience with encoders + mecanum wheels who can offer a few tips my way? At this stage the project doesn’t have clear engineering goals and is mostly an academic exercise. I’ve read that using a rigid chassis and minimizing the need for lateral motion can reduce slippage, but reading through a few papers didn’t give me any numerical indication of what to expect.

  • Sieguito@lemmy.world · 10 months ago

    Even if you add encoders on the wheels, it’s still an open-loop locomotion method – you need the room to give you feedback about the robot’s position on the plane / in the space.

  • rufus@discuss.tchncs.de · 10 months ago (edited)

    I think mecanum wheels slip quite a bit, so I’m not sure how effective those encoders would be. But I’ve only ever tried 3D-printed ones, so I’m not super sure.

    You’re sure your STEM students are ready to handle the LiDAR? Managing point clouds, doing the arithmetic, path planning, etc.? We had a practical course with little robots, but they had 3 of those Sharp distance sensors at the front and a bumper with a switch. That was enough to teach many concepts, and simple enough for the students to implement in something that was just a project and not a full-time job. But I’m sure that depends on what exactly you want to teach…

    And our robots had the motor drivers (H-bridges) in socket terminals so they could be replaced, because every so often someone wasn’t very clever or didn’t listen in the lectures.

    • Saigonauticon@voltage.vn (OP) · 10 months ago

      They’re university software engineering students, probably a year or two into their degrees. I’m hoping to provide the robots as completed units that are controlled via API, because we’re not likely to get many students with electrical, mechanical, or embedded backgrounds. You’re right about the complexity though, and that’s something I’ve been thinking about – I guess I’ll start out with a bit of optimism regarding their talents, and scale back if needed :D

      I don’t really have a scope, budget, timeline, or audience properly defined for this project. In short, my client has a STEM program for building and interacting with digital maps, but it’s way too boring and I’m determined to breathe some life into it. So I’m going to have to play a lot of things by ear.

      • rufus@discuss.tchncs.de · 10 months ago (edited)

        Well, kids / young people / students will surprise you anyway, no matter what you planned ahead. I think teaching this way just requires you to stay flexible, try things with the students, and see what works. University students will benefit from a little challenge, but it shouldn’t be impossible and get them frustrated. I’ve never taught myself, but I bet it’s difficult to hit that balance.

        Programming little robots is awesome, though. I think seeing robots move and do tasks is on a whole other level compared to looking at your screen and programming something that changes a few pixels there. My university course was more related to embedded devices and closer to the electronics. It teaches you valuable lessons when you’re forced to interact with electronics, real-world physics, and constrained resources, and you need to get your maths right. Usually students are concerned with something like Java, learning object-oriented programming, or handling some big frameworks. Or learning maths. Robotics teaches you to really pay attention, combine different skill-sets, and do things without an easy route available.

        Maybe it’s just me who likes electronics too much, but I’m sure the kind of motivation you get from watching a real robot move while it runs your code is unique. And kind of universal – you can use it in pre-school or in university to spark imagination and motivation.

        Your task is a bit different. If you’re teaching something like simultaneous localization and mapping and the students also have to deal with all the robotics, sensors, and real-world problems, it might be more of an ordeal for them than fun. Even dealing with noisy sensor values is a hassle until you grasp the bigger picture.

        If you’re giving them access to an API, you can choose and adjust what kind of abstraction you provide: give them something high-level, or have them do more work. You could prepare most of the implementation and adjust the level of detail while teaching – maybe skip something and give them working code via your API so they can focus on the problem they’re actually supposed to learn. You can also do it the other way round: let them start with all the low-level stuff handled for them and learn the big concepts, then let them dig down and see what your API functions had abstracted away until then. That way round you won’t run out of time.

        I’m sure including actual robotics is going to get them more motivated in contrast to running a simulation.

        • Saigonauticon@voltage.vn (OP) · 10 months ago

          Haha, I know exactly what you mean – I’m most interested in resource-constrained embedded systems. I like the ATtiny10 a lot. At work I mostly write Python, but in my own time it’s mostly assembly language. It feels more concrete, every decision matters, and anything that goes wrong is 100% your fault, as there are relatively few bugs at that level. It’s a lot of fun. Also, the datasheet is very good.

          I’m self-taught with all the electronics stuff; I paid for it by teaching a course on whatever thing I’d done most recently, then used the proceeds to buy tools and parts for the next big (often dumb) idea. I’d also ask colleagues in software engineering programs for their assignments and complete them in my spare time. It puzzled a few people why I would want to do assignments, and indeed some were very boring (oh god, Java + Spring framework), but others were quite interesting (formal study of algorithms). Sadly, economic reality kicked in and I had to run a company instead of pursuing my education further (I still try to do one ridiculous engineering thing per year, though).

          I guess there’s a real risk (…like 100%) that I overestimate the motivation students have – so I think I’m going to take your advice and set the level of abstraction with something API-like to abstract away the low-level components (this is closer to my client’s domain). I’m imagining a robot that acts as a WiFi access point, exposing something “like an API” over UDP packets that describe high-level functions. Then I’d start with something simple – like a digital map with a known starting location, and a small obstacle course that can be completed with simple distance measurement, no point clouds. If that goes well, I can develop towards more complex material – probably not full SLAM, but maybe localization on a pre-mapped surface. I have plenty of my own code as examples of how to do simple UDP communications in Python, which I could expand into a custom library.

          Sort of like Logo from 1983, but with a physical robot and sensors. I’m a little too young to have used Logo, but the computer lab in my school was really outdated, so I got to try it once :D
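
          Roughly what I have in mind for the student-facing side – the command strings and class name here are entirely made up, just to show the level of abstraction:

          ```python
          import socket

          class RobotAPI:
              """Thin wrapper that turns UDP packets into high-level verbs."""

              def __init__(self, host, port=4242, timeout=2.0):
                  self.addr = (host, port)
                  self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                  self.sock.settimeout(timeout)

              def _cmd(self, text):
                  self.sock.sendto(text.encode(), self.addr)
                  reply, _ = self.sock.recvfrom(1024)  # assume the robot acks each command
                  return reply.decode()

              def forward(self, mm):
                  return self._cmd(f"FWD {mm}")

              def turn(self, degrees):
                  return self._cmd(f"TURN {degrees}")

              def distance_mm(self):
                  return float(self._cmd("DIST?"))

          # bot = RobotAPI("192.168.4.1")  # typical IP when the robot is the AP
          # bot.forward(300)
          # bot.turn(90)
          ```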

          • rufus@discuss.tchncs.de · 10 months ago

            Well, there’s also turtle graphics to program in Python (I think), and there’s Scratch.

            I guess there’s a real risk (…like 100%) that I overestimate the motivation students have

            Definitely sounds like it. But a motivated teacher is a very good thing. Maybe you’re able to get that spark across to some of the students.

            API-like to abstract away the low-level components

            You can always have some extra assignments ready, just in case someone is curious and wants to do/know more. A room full of students will have a mix of knowledge, abilities, and motivation anyway.

            I’m most interested in resource-constrained embedded systems. I like the ATtiny10 a lot.

            I also devoured a few books and datasheets on the Atmel chips in my time. Their design is well thought out and probably an excellent subject for learning microcontroller concepts.

            As of now I like the ESP32. It is ridiculously overpowered if you’re used to something like the ATtinies or old ATmegas: (at least) 520 kB of RAM, two cores at up to 240 MHz (depending on variant), and very nice peripherals. The WiFi connectivity is really useful too. But it definitely adds to the fun if you’ve programmed the more constrained previous generation of microcontrollers, know how spoiled you are, and can feel like a supervillain deliberately wasting hundreds of kilobytes of memory. Or (ab)use some of the peripherals for things that wouldn’t be possible with the few timers available on the Atmel chips. Or do trigonometry at crazy frequencies for your robots, because you can handle 32-bit floating point numbers. But I’d agree: it doesn’t teach you the same things if floating-point arithmetic is cheap and you never find out whether calculating a square root is an easy or a difficult thing to do. The STM chips also have nice peripherals, but I haven’t really fiddled around with those.

            Definitely hope you’ll have fun being involved in that STEM program.

            • Saigonauticon@voltage.vn (OP) · 10 months ago

              Good advice all around! Thanks!

              I’ve also messed around with the ESP8266 and various models of ESP32 – their WiFi time-of-flight stuff is interesting, and I have quite a few projects with both, actually! My main complaint is that the GPIOs don’t behave nicely: they’re much slower than I’d expect and have weird states on boot (also, the ESP8266 is a power hog and reboots if you screw up the network stack). It’s not too bad to work around, but I chose the Pi Pico W so I wouldn’t have to explain any of it.

              It still blows me away that I can easily do public-private key encryption on the ESP32. And graphics. At the same time!

  • BlueAure@infosec.pub · 10 months ago

    Assuming that there is at least some slippage between the wheels and the ground, it seems to me that you’ll need to check the ToF sensors regularly anyway. I’ve found that encoders are fantastic for a lot of things, but not so much for measuring distance, because of the problems you’ve described. Perhaps run a recurring local check on a reduced set of points to verify location, then forward the full cloud less often for further remote processing? It really sounds like you have a tradeoff depending on whether you value accuracy of location or accuracy of wheel rpm (analogous to speed). Using both would give you a nice way to calculate the ideal motor rpm to minimize slippage in a surface-agnostic way.
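
    Something like this rough check is all I mean – every name here is made up, but comparing the distance the encoders claim against what the ToF sensor actually measured gives you a slip estimate per move:

    ```python
    def slip_ratio(pulses, mm_per_pulse, tof_before_mm, tof_after_mm):
        """Fraction of commanded travel lost to slip (0.0 = no slip)."""
        encoder_mm = pulses * mm_per_pulse        # distance the wheels claim
        actual_mm = tof_before_mm - tof_after_mm  # distance toward the wall
        if encoder_mm == 0:
            return 0.0
        return 1.0 - (actual_mm / encoder_mm)

    # If slip_ratio stays high at a given rpm, lower the commanded speed.
    ```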

    • Saigonauticon@voltage.vn (OP) · 10 months ago

      Ok yeah – I’m leaning toward relying more on the laser ToF sensors than on the rotary encoders.

      A simple ‘pick a LiDAR point and drive toward it’ algorithm does sound like the easiest approach. Thanks for weighing in!