A while back I posted about a side project I’ve been working on that I’ve been referring to as ARMOR, for “Augmented Reality Mobile Robotics.” I have this vision of an app that can visualize robot geometry and basic motion from files in standard formats like URDF, the standard in the Robot Operating System (ROS). Extending that idea, I’d like to add modules for things like sensing and control. Aside from being able to view URDF on your phone, which in itself is unique, it will also add the capability for viewing in augmented reality (AR), if desired, with much more advanced graphics than current tools like rviz.
If you want to view the source code for this project, check out https://github.com/radcli14/BB25
That project is ongoing, and like many of its kind, growing in complexity by the week. In the meantime, I had the idea of integrating a high-fidelity simulation code, making ARMOR itself a multibody dynamics (MBD) app (a bit like MOMDYN, perhaps?). Along the way I learned about this little (actually, quite big) project called MuJoCo, for Multi-Joint dynamics with Contact, which has a community-supported Swift extension. MuJoCo has been picked up by some power players in the tech industry; in fact, it’s now owned by Google DeepMind, and powers NVIDIA Newton. I liked the idea of integrating this advanced MBD code into ARMOR, but felt it was a bit of a large step to try to do it in the big app without trying it out first.
So, instead, I created a new, smaller app, which I called BB25, for BoE Bot in 2025. Rather than trying to generalize all of the features of URDF and MuJoCo to any possible robot configuration, this builds one type of two-wheeled, differential-drive robot, specifically the Parallax Board-of-Education (BoE) Bot. I used Blender to create a near-photorealistic model of the BoE Bot, and exported it to USDZ, then used RealityKit for AR and rendering, and SwiftUI for a user interface. Finally, I tagged on MuJoCo, enabling the dynamic simulation that I’ve been craving. As you can see from the video, it all holds together quite well!
Why the BoE Bot?
I won’t claim to be a historian on the Parallax product line, nor am I using this platform to provide an endorsement; I’m just saying what I know. Somewhere around the year 1999 (Mommy? Why does everybody have the bomb?), Parallax released the BASIC Stamp board, with support for the BASIC programming language, and a variety of mounting kits and electronics to build robots. My father picked up the BoE Bot as a platform to teach Mechatronics at Michigan State University (MSU). In the same timeframe, I was a teenager, and much of my early programming experience came via the BASIC Stamp.

Creating a Realistic Model in Blender
The picture above is one that I bugged my dad for; I remembered it from a poster that was on display at his retirement party from the College of Engineering at MSU. Initially I had the idea that I would take that picture, build a couple of blocks and cylinders to rough out its shape, then build that _simple_ representation into my app. But then I said to myself, why not try to go photorealistic?

You can see the final result in the image above, which I will say looks great, but let me tell you, it was a full-blown rabbit-hole to get there. I started off with the assumption that maybe somebody else had already done this, and searched some of the usual suspects, like GrabCAD, Sketchfab, etc., but only found parts that were in SolidWorks format, or otherwise were not usable for my own project. Parallax also has a few models available, but only piece-parts, not the whole robot. So I opted to make one of my own.
It became a bit of a painstaking journey of guesswork to construct each of the parts, but fairly enjoyable in the end. The chassis is created from a plane that I then “bent” and “cut” into the dimensions of the real thing, for which, thankfully, Parallax does provide a dimensional drawing. I used solidify and bevel modifiers to give the chassis its thickness and round the corners, and triangulated it to make sure that the mesh didn’t have any ugly n-gons.


The wheels are built using a reference image that I also found on the Parallax website; in this case it was not a dimensional drawing, so I eyeballed the wheel to be about a 35 mm radius, and went from there. Once more, I used solidify and bevel modifiers to turn the outer rim and hub of the wheel from simple circles into solid parts with rounded edges. Then, I used array modifiers and booleans to perform the cuts in the hub.


Finally, the board is where I got real creative. Once again, I found an image on the Parallax website, but this time, instead of just using it as a reference image, I actually applied it as a material in the model itself. Of course, just using it as a material meant that, even though you could see the parts on the board, the model itself was still flat. My clever idea was to create various extrusions matching the shapes and positions of some of these parts on the board, then configure their UVs to map just the specific region of the same board image that each part represented. From a distance, it looks pretty good!


After all that, I exported the model in USDZ format, which is required by RealityKit for the AR app integration. Because I like you, I also went ahead and uploaded the same model to my Sketchfab page, so you can download it yourself if you like, or just view it in your browser. A GLB version is included as well, as I think that’s a bit more aligned with the preferences of the Googles.
RealityKit and SwiftUI
Next up was to take the USDZ file that I generated, and build up the scripts to render it in AR for native iOS using RealityKit and SwiftUI. The end result looks like the graphics below, which also show the “virtual” camera mode that is compatible with macOS, or for when you just don’t care to use your camera.


The primary function of the scripts is to populate a RealityView, which is the latest structure that Apple has provided to render 3D content in AR. The structure itself actually originated with the introduction of visionOS, and therefore it also targets mixed reality via the Vision Pro headset, but it was ported to iOS last year, and is expected to be the main AR entry point for all Apple systems moving forward. In the code snippet below, you can see that I am basically setting the camera on startup (virtual is the mode on the left, spatial tracking on the right, in the image above), then adding content based on entities generated in the resetScene method of my viewModel. Then, I “subscribe” to scene events, which sets things up so that on every rendering frame (1/60 of a second-ish), I will recalculate my forces and update the physics model. The update method of the RealityView responds to state changes in the view model; what I basically care about is whether the user has initiated a reset action or changed the camera. Finally, I add the orbiting camera control (only applicable to the virtual mode), and give the view its title (“BB25”).
RealityView { content in
    switch viewModel.camera {
    case .virtual: content.camera = .virtual
    case .spatialTracking: content.camera = .spatialTracking
    }
    viewModel.resetScene(onComplete: content.add)  // Adds the anchor captured in the closure
    let _ = content.subscribe(to: SceneEvents.Update.self, on: nil, componentType: nil) { event in
        viewModel.applyForces()    // Sets the forces at each frame update
        viewModel.updatePhysics()  // Updates physics simulation based on selected mode
    }
} update: { content in
    switch viewModel.resetState {
    case .requested:
        switch viewModel.camera {
        case .virtual: content.camera = .virtual
        case .spatialTracking: content.camera = .spatialTracking
        }
        viewModel.resetScene(onComplete: content.add)
        if viewModel.physics == .muJoCo {
            viewModel.resetSimulation()
        }
    default: break
    }
}
.realityViewCameraControls(.orbit)
.navigationTitle("BB25")

The complicated part of the app is implemented in the view model, as well as several extensions to the Entity class, which I have linked, but won’t paste in their entirety here (plus, they’re certain to change). As I mentioned, there is a reset method in the view model, which builds an anchor entity asynchronously. That anchor entity is sent back to the RealityView in a callback once it has been built, and its child entities from the USDZ file are all loaded with their respective physics setup.
Many of the physics and geometric properties associated with those children are managed in the entity extensions. These extensions locate the separate, moving parts, meaning the chassis and wheels. A set of revolute joints is then added to each wheel, which in RealityKit terminology involves the creation of “pins” attached to each entity, having a location and a specific axis, and then creating joints that constrain pairs of pins together. I also initialize physics simulation components, which for purposes of this model direct gravity along the model’s downward -Z axis (differing from RealityKit’s Y-up convention), and crank the solver position iterations up to 128, which works, but as I’ll cover momentarily, not really to my satisfaction.
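To give a flavor of that setup, here is a minimal sketch of pinning one wheel to the chassis and redirecting gravity; the function names, hub position, and axle axis are illustrative assumptions, not the exact values from the BB25 extensions.

import RealityKit

// A sketch of pinning a wheel to the chassis with a revolute joint. The names,
// hub position, and axle axis below are illustrative, not the BB25 values.
func attachWheel(_ wheel: Entity, to chassis: Entity) throws {
    // Each pin marks a point and axis on its entity; this orientation rotates
    // the pin's default axis to point along the wheel's spin direction
    let axleOrientation = simd_quatf(from: [1, 0, 0], to: [0, 1, 0])
    let chassisPin = chassis.pins.set(
        named: "rightAxle",
        position: [-0.046, -0.055, 0.035],  // hub location relative to the chassis
        orientation: axleOrientation
    )
    let wheelPin = wheel.pins.set(named: "hub", position: .zero, orientation: axleOrientation)

    // The joint constrains the two pins to coincide, leaving only the hinge rotation free
    let joint = PhysicsRevoluteJoint(pin0: chassisPin, pin1: wheelPin)
    try joint.addToSimulation()
}

// Gravity redirected along the model's -Z axis, rather than RealityKit's Y-up default
func configureSimulation(on root: Entity) {
    var simulation = PhysicsSimulationComponent()
    simulation.gravity = [0, 0, -9.81]
    root.components.set(simulation)
}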

Finally, I created my own custom JoyStick view, which I use to allow the user to control the forward/reverse and rotational motion of the robot in real time. This binds to a ControlState variable that is provided by the RealityView, which really is just two numbers that range between -1 and +1. Drag up and down in the joystick and you get forward/reverse control; drag left and right and you get rotational control.
I stylized the joystick with a variety of features:
- A thin material background with rounded corners to depict the edges of the joystick control pad.
- Icons with linear and circular arrows to direct the user where to drag to get their desired motion.
- A circular shape with shadow to indicate the current location of the user’s finger.
- A gradient mask on the background, and state dependent coloring of the icons, to represent the drag state.
All of these combine to produce the translucent control pad you see on the bottom in the videos and images I’ve posted.
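Underneath that styling, the gesture handling is simple. Here is a bare-bones sketch, where ControlState and its property names are stand-ins for the types in the BB25 source, not the styled control itself.

import SwiftUI

// A bare-bones version of the joystick's drag handling; `ControlState` and
// its property names are illustrative stand-ins for the BB25 types.
struct ControlState {
    var forward: Double = 0  // -1 = full reverse, +1 = full forward
    var turn: Double = 0     // -1 = full left, +1 = full right
}

struct MiniJoyStick: View {
    @Binding var control: ControlState
    private let radius: CGFloat = 75

    var body: some View {
        RoundedRectangle(cornerRadius: 20)
            .fill(.thinMaterial)
            .frame(width: 2 * radius, height: 2 * radius)
            .gesture(
                DragGesture()
                    .onChanged { value in
                        // Normalize the drag offset to the -1...+1 range on each axis
                        control.turn = max(-1, min(1, value.translation.width / radius))
                        control.forward = max(-1, min(1, -value.translation.height / radius))
                    }
                    .onEnded { _ in
                        control = ControlState()  // release springs back to center
                    }
            )
    }
}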
MuJoCo
It was always my intention to integrate MuJoCo with an AR app, as I view that as a key element of the ARMOR project, but before I did that, I thought I’d try out the built-in RealityKit physics first, with joints and control. As I write this, it works, but from an engineer’s perspective, it leaves a lot to be desired. First of all, just getting things hooked up so that the wheels aligned and located with the chassis was an incredible chore; something about the methods and conventions in RealityKit just didn’t align with my way of thinking. Then, once everything actually was assembled, the revolute joints at the wheels didn’t seem to want to behave like actual revolute joints; they’d cant to the side when the vehicle would turn. As mentioned earlier, cranking the solver iterations up to 128 helped, but there’s a performance hit that comes with it. Ultimately, it’s a fundamental issue in how RealityKit (and really any game engine) implements physics relative to how I know it should be: joints, or any constraints, are treated as a “best effort” rather than a law. Finally, even though Apple’s documentation claims that you can add a torque to a body during simulation, I was never actually able to get the wheels to rotate from a torque alone; I was only able to control the motion by adding linear forces at the wheel axles, which is physically wrong.
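For illustration, a sketch of that linear-force workaround is below, assuming the wheel is a ModelEntity with a physics body; the function name, throttle parameter, and force scaling are my own placeholders.

import RealityKit

// A sketch of the linear-force workaround: instead of a torque on the wheel,
// apply a force at the axle, expressed in the chassis frame. The function,
// names, and force scaling are illustrative placeholders.
func drive(wheel: ModelEntity, chassis: Entity, throttle: Float) {
    let maxForce: Float = 0.5  // newtons, tuned by eye
    let force = SIMD3<Float>(throttle * maxForce, 0, 0)  // forward in the chassis frame
    wheel.addForce(force, relativeTo: chassis)
}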
Now, before even attempting to add MuJoCo to this sample app, first I needed to build the MuJoCo-formatted XML file that defines the model specification. This turns out to be fairly straightforward, and much aligned with my engineering intuition. One of the nice things about MuJoCo is that it has built-in support for a few different 3D file formats, with .stl being the simplest from my point of view. Here I define a set of colors for visualizing the different parts of the model, and point to the model assets I want to load at run time (just the chassis and wheel).
<asset>
  <material name="green" rgba="0.094 0.27 0.23 1"/>
  <material name="white" rgba="1 1 1 1"/>
  <material name="orange" rgba="1 0.3 0.1 1"/>
  <material name="blue" rgba="0.2 0.2 1 1"/>
  <material name="black" rgba="0 0 0 1"/>
  <mesh name="chassis" file="chassis.stl"/>
  <mesh name="wheel" file="wheel.stl"/>
</asset>

I can define the chassis to be a body with a free joint, and assign a color, geometry, and mass to it, the geometry being the chassis mesh I loaded from .stl.
<!-- Root robot body is the chassis; has a freejoint so it can move in 3‑D -->
<body name="chassis" pos="0 0 0.05">
  <freejoint name="base_free"/>
  <!-- Chassis geometry -->
  <geom name="chassis_geom" type="mesh" mesh="chassis" material="white" mass="1.0"/>
Each of the wheels follows a similar approach, but with the position representing its location relative to the chassis, and the Euler angles representing the rotation of the joint axis that attaches it to the chassis; in this case, you can see a negative rotation of pi/2 radians, which directs this wheel out of the side of the chassis. The axis attribute of the joint then states that this wheel will rotate about its local +Z axis, after the Euler rotation. We also assign the geometry from the wheel mesh that we loaded.
<!-- Right wheel -->
<body name="right_wheel" pos="-0.04645 -0.0555 0.035" euler="-1.570796327 0 0">
  <joint name="right" type="hinge" axis="0 0 1"/>
  <geom name="right_wheel_geom" type="mesh" mesh="wheel" material="orange" mass="0.1"/>

I also create actuators, which get assigned to each of the joints by name. These are set up to be closed-loop velocity actuators; in other words, the control input specifies a target angular velocity that the wheel will try to track. Gearing, gain, and limits are also assigned.
<actuator>
  <velocity name="right_motor" joint="right" gear="0.1" kv="5" ctrllimited="true" ctrlrange="-1 1"/>
  <velocity name="left_motor" joint="left" gear="-0.1" kv="5" ctrllimited="true" ctrlrange="-1 1"/>

This is, of course, not the complete model definition; you can find that in the linked repository. I give these examples to show that the formatting and terminology used in the MuJoCo XML file are intuitive, and aligned with the training we receive as mechanical engineers. This barely scratches the surface, as there is a whole slew of mechanical, electrical, and sensing components that MuJoCo can simulate, but it’s all a great start for my robotics simulation.
As far as adding the MuJoCo simulation into my RealityKit app, there happens to already be a community-supported GitHub repository that does just that. I did have to create my own fork, which disables a filename variable that was causing a compiler type-checking error, but other than that, I found it pretty simple to build right into my BB25 code. First of all, I took my chassis and wheel entities and made them kinematic rather than dynamic, which basically disabled them in the built-in physics engine. In fact, I never initialize the built-in physics at all when using MuJoCo, saving on all the solver iterations I mentioned before.
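Switching the entities over looks something like the sketch below, assuming each part already carries a PhysicsBodyComponent from the earlier setup; the function and the entity list are illustrative.

import RealityKit

// Take the robot's moving parts out of the built-in dynamics so MuJoCo can
// drive their transforms instead; the function and entity list are illustrative.
func switchToKinematic(_ parts: [Entity]) {
    for entity in parts {
        if var body = entity.components[PhysicsBodyComponent.self] {
            body.mode = .kinematic  // the built-in engine no longer integrates this body
            entity.components.set(body)
        }
    }
}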
Instead, in the reset method of the view model, I use the swift-mujoco code to load up a model from the XML file.
model = try MjModel(fromXMLPath: filepath)

Then, I create a data object to track the generalized coordinate states during simulation.
data = model?.makeData()

Any time the RealityView receives a scene update event (again, every 1/60 of a second), I can step the simulation, and write the updated states into my data structure.
model.step(data: &data)

The step increment in the line above is actually 0.002 seconds, which is a parameter setting in the XML file, so I actually call it enough times to get the simulation time synced up with the amount of real time that has passed. The data structure contains all of the generalized coordinate states that I need to update the transforms that are applied to the chassis and each wheel at each frame.
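Putting those pieces together, the per-frame update looks roughly like the sketch below; the function, the accumulator variables, and the free-joint qpos indexing (three position values followed by a w-x-y-z quaternion, assuming qpos is index-accessible as in the swift-mujoco bindings) are my own illustration of the approach, not a copy of the BB25 source.

import MuJoCo      // module name assumed from the swift-mujoco package
import RealityKit

// Sketch of the fixed-step synchronization: march the 0.002 s MuJoCo clock
// until it catches up with wall-clock time, then pose the chassis entity from
// the free joint's entries in qpos. Names here are illustrative.
func syncFrame(model: MjModel, data: inout MjData, chassis: Entity,
               deltaTime: Double, simTime: inout Double, realTime: inout Double) {
    let timestep = 0.002  // must match <option timestep="0.002"/> in the XML
    realTime += deltaTime
    while simTime + timestep <= realTime {
        model.step(data: &data)
        simTime += timestep
    }
    // Free joint states: qpos[0...2] = position, qpos[3...6] = quaternion (w, x, y, z)
    let position = SIMD3<Float>(Float(data.qpos[0]), Float(data.qpos[1]), Float(data.qpos[2]))
    let rotation = simd_quatf(real: Float(data.qpos[3]),
                              imag: [Float(data.qpos[4]), Float(data.qpos[5]), Float(data.qpos[6])])
    chassis.transform = Transform(rotation: rotation, translation: position)
}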
Comparison of MuJoCo to Built-in Physics, and Other Thoughts
I left a toggle control in the app so I could switch between the two simulator options, just so I could compare. What I find is that, to an untrained observer, they probably look pretty much the same. Both move at about the same pace, handle contact similarly, and make for a fun little robot game. But since I built the app, I can see the subtle differences that make the MuJoCo build-up worth the time.
- The initial drop in the RealityKit mode leads to a bit of an over-emphasized bounce. I suspect that this may be due to some overpenetration, combined with the somewhat small mass of the wheels; they probably get flung too far by the collision before the constraint solver corrects itself.
- The RealityKit version looks a bit wobbly as it moves around. This is the same issue I mentioned with the revolute joint not being strictly enforced: when you drive around a bit, the wheels will start to tilt a little any time you turn, before correcting themselves when you move in a straight line.
On the latter note, you’d be shocked at how long it took me to get the revolute joints to even hold together at all, and I had to create a custom cylinder shape to actually make their contact behavior resemble wheels. I’m a stickler for accuracy, so even getting to this point was sort of a win, but it’s still not how I like it.
The MuJoCo model, if you pay attention, is much smoother, and for good reason: all of the surfaces are nice and smooth, and since the constraint laws are obeyed, the wheels can *only* rotate about their axis. This is closely tied to the fact that it uses a generalized coordinates convention, which to me implies that the simulation authors think like dynamicists. MuJoCo also runs an authentic closed-loop control simulation for the wheel velocities, which, from my perspective, means that if this were a simulation of real importance, I could prescribe the motion of the wheels *exactly* as I need them to be, rather than relying on the approximation and guesswork of the RealityKit engine.
In the end, I’m still happy with what I got out of both, and don’t intend to criticize the RealityKit physics engine too much, as it serves a certain purpose quite well; that is, gaming and visual arts. Adding MuJoCo as a layer on top just provides that extra degree of precision that I demand as a skilled engineer.
So which of the two was easier, you might ask? Surprisingly, MuJoCo, by a long shot. It took me a little while to figure out the compiler issue I alluded to earlier, but once I got to the actual “writing code” part, I think I had it connected and working in an hour or two. In comparison, I struggled for days with getting RealityKit joints hooked up just as I wanted them. That is to say, if you want realistic physics in an iPhone app (or Unity, for that matter, on any platform), MuJoCo is certainly worth a try.
As always, if you are interested in collaborating on a mobile app development project, especially in the fields of augmented reality or mechanical engineering, shoot me an email at eliott.radcliffe@dc-engineer.com, or add me on LinkedIn.
