3D Model 12: 3D-Printable Rubik's Cube Solving Robot

1. General Information
2. Introduction Video
3. Licensing
4. List of 3D-Printed Parts
5. Hardware Shopping List
6. Assembly Instructions
7. Connecting Electronics
8. Servo Calibration with Pololu Maestro Control Center
9. Software Installation and Configuration
      9.1 Introduction
      9.2 Installation on Raspberry PI
      9.3 Installation on a Regular Windows 10 PC
      9.4 Configuring and Running the Application
      9.5 Troubleshooting
10. Frequently Asked Questions

1. General Information

This 3D-printed Raspberry Pi-powered Rubik's Cube solving robot has everything any serious robot does -- arms, servos, gears, vision, artificial intelligence and a task to complete. If you want to introduce robotics to your kids or your students, this is the perfect machine for it.

This one-eyed, four-armed giant stands 35cm (14") tall. Printing it takes 70 hours and an entire spool of filament, not to mention over $200 worth of hardware, but once fully operational, it will surely wow your friends and neighbors. Scramble your Rubik's cube, place it in the robot's grippers, press a button, and then sit back and watch this amazingly smart and stunningly beautiful machine put it back together. Watch the video below!

This robot is fully 3D-printable. Other than the servos, servo horns, camera, electronics and a few dozen bolts and nuts, it has no traditionally manufactured parts. Absolutely no soldering or breadboarding is required.

Click to download these .STL files

Download Link
Size: 2.66 MB
Last Updated: 2017-09-19

2. Introduction Video

3. Licensing

This product is distributed under the following license:

Creative Commons - Attribution - Non-Commercial - No Derivatives

For educational and commercial licensing, please contact us.

4. List of 3D-Printed Parts

Item | Quantity | Print Time (min.) | Total Time (min.) | Filament (gr.) | Total Filament (gr.)

Total Print Time: 4,028 min. (67 hours 08 min.)
Total Filament Required: 882 gr.


  • If your camera has a wide-angle lens and the cube appears too far away in the photos, print 4 of rcr_rod_short.stl and 4 of rcr_rod.stl instead of 8 of rcr_rod.stl.

5. Hardware Shopping List

Quantity | Item | Price Per Item (approx.)
4 DS3218 Servo Motor with Horn
4 150 mm Servo Extension Lead, Male-to-Female
4 Hitec HS-311 Servo Motor
1 Raspberry Pi 3 Model B Quad-Core (optional)

This part is optional. The app can be run on a regular Windows 10 PC as well.
1 Pololu Mini Maestro 12-Channel USB Servo Controller (Assembled)
1 USB 5 MP or 12 MP Webcam with 6 LEDs

Search on eBay, Amazon, etc. for Webcam with LEDs. The camera we use was sold under the brand "HDE", but often no brand is mentioned at all. Just look for this distinctive shape. It can be bought for as little as $3.00 to $5.00.

Do NOT buy the TechNet brand; it does not seem to work with our app.

1 6V, 3A (3000 mA) power source, wall-plugged or rechargeable

Our robot uses the SMAKN power supply adapter (shown here), with the round plug replaced by two female connectors that plug into the Pololu servo controller. The replacement was done by a competent technician proficient in soldering. Use a wall plug at your own risk. See the Connecting Electronics section below for more information.
1 Standard-Size Rubik's Cube

Without this item, the robot is completely useless. We recommend the stickerless, smooth-operation variety. Our color recognition code was only tested with the standard (original) colors shown here. Please do NOT use a speed cube!
76 Metric M3-12 Phillips-head Countersunk Bolts $0.06
36 Metric M3 Nuts $0.06
9 Small 2mm Wood Screws or Metric M2x8 Bolts
Eight of these attach the HS-311 horns to the servos, and one more attaches the Pololu servo controller to the back side of the camera holder.

Total Cost of Hardware (approx.): $200.00.

6. Assembly Instructions

Step 1:

Attach the round horn that comes with the HS-311 servo to pinion with two small 2mm wood screws or two metric M2x8 bolts.

Step 2:

Insert the single-armed horn that comes with the DS3218 servo into gripper. Secure with two metric bolts. Screw in the bolt closer to the center first.

Step 3:

Insert the DS3218 servo into slider. Secure with 4 metric bolts and nuts.

Step 4:

Insert rack into slider, with the servo cable running in the triangular recess in the bottom of the rack. Align the holes and secure with 6 metric bolts.

Step 5:

Insert the HS-311 servo into arm. The servo's shaft must be aligned with the round hole on the other side of the arm. Secure with 4 metric bolts and nuts.

Step 6:

To secure the slider in place, install the pinion onto the HS-311 servo's shaft and secure it with an axis bolt that came with the servo. Note that during the calibration phase the pinion may need to be removed, slider adjusted, and pinion replaced.
Repeat Steps 1 to 6 to assemble the other three arms.

Step 7:

Using the 8 corners, assemble the 4 arms into a single unit.

Step 8:

Set the assembly obtained in Step 7 onto the two legs and align the holes. Insert the pair of long_bolts into the bottom holes, and short_bolts into the top holes. Secure all 4 bolts with nuts. The heads of the bolts must be on the same side as the HS-311 servos (far side on the picture below), while the nuts go on the opposite side (near side on the picture below.)

Step 9:

Screw four rods into the heads of the long and short bolts tightly. If your camera is not 5 MP but 12 MP, we recommend using rcr_rod_short.stl instead of rcr_rod.stl to bring the camera closer to the cube.

Step 10:

Screw four other rods (rcr_rod.stl) into the slots of camera_holder tightly.

Step 11:

Position the camera holder in such a way that the ends of the rods attached to it are in close proximity to the ends of the rods attached to the main unit. The slit in the camera holder must point downwards. Using the clamp_halves, connect the 4 pairs of rod ends. Secure the clamp halves with the metric bolts and nuts.

Step 12:

Remove the stand and semi-circular ring from the webcam using a small screwdriver. Insert the webcam into the niche in the camera holder. Run the webcam cable through the slit in the camera holder. Secure the webcam with camera_cover. Plug the webcam into the Raspberry PI's USB port. Install the grippers onto the DS3218 servos and secure them with the axis bolts that came with the servos. Do not tighten the servo horn clamps just yet, as the positions of the grippers may need to be adjusted during the calibration phase.

7. Connecting Electronics

Thanks to the Pololu Mini Maestro servo controller, there is absolutely no need for PCBs or breadboarding. You connect the 8 servos to the Maestro, and the Maestro to your PC via a USB cable for calibration (and later to the Raspberry PI for the actual cube solving.)

The servos can be connected to any of the 12 channels of the Mini Maestro arbitrarily. The image below shows the channel assignment used by our robot. A white number in a red circle next to a servo denotes the Maestro channel number for this servo. Even channels are used for the gripper servos, and odd channels for the rack-and-pinion servos. Channels 4 and 5 are skipped for spacing.
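
In software, this channel assignment is just a lookup table. The Python sketch below is a hypothetical rendering of the convention described above (even channels for gripper servos, odd channels for rack-and-pinion servos, channels 4 and 5 skipped); which arm gets which channel pair is an assumption, since the actual assignment is shown only on the image.

```python
# Hypothetical channel map following the convention described above.
# The arm-to-channel pairing is an assumption for illustration.
CHANNELS = {
    "bottom": {"gripper": 0, "pinion": 1},
    "left":   {"gripper": 2, "pinion": 3},
    "top":    {"gripper": 6, "pinion": 7},
    "right":  {"gripper": 8, "pinion": 9},
}

def sanity_check(channels):
    """Verify the even/odd convention and that channels 4-5 stay unused."""
    used = set()
    for arm in channels.values():
        assert arm["gripper"] % 2 == 0, "gripper servos go on even channels"
        assert arm["pinion"] % 2 == 1, "rack-and-pinion servos go on odd channels"
        used.update(arm.values())
    assert used.isdisjoint({4, 5}), "channels 4 and 5 are skipped for spacing"
    return sorted(used)
```

Keeping such a table in one place makes it easy to re-map channels later without hunting through the control code.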

For the power supply for the Maestro, you have a choice between a rechargeable 6V battery pack and a modified 6V, 3A (3000 mA) wall charger. The 1st option is safe but servos are power-hungry, and the battery pack drains quickly. Make sure you buy a high-capacity pack.

The 2nd option requires that the charger's standard round connector be removed and replaced, or extended, with two wires ending in standard female connectors. The work must be done by a competent technician proficient at soldering. The charger must be rated at 3A or higher. Our robot is powered by this power supply adapter extended as shown below. Use this option at your own risk.

The servo controller should be attached to the back of the camera holder using a single small wood screw. The image below shows the controller in its working position, with all the servo cables and power wires attached to it. The green and yellow wires in the lower-right corner of the image are power wires.

8. Servo Calibration with Pololu Maestro Control Center

The purpose of the servo calibration is to find two key target settings for each servo's channel. For the gripper servos, the two target signals are for the neutral position and the 90° position. For the rack-and-pinion servos, the two target signals are for the "near" position (hugging the cube) and the "far" position (releasing the cube.) These numbers are determined experimentally using the Maestro Control Center software available on the Pololu web site and installed on your PC.

After firing up the Maestro Control Center, select the controller from the "Connected to" drop-down box. Go to the Serial Settings tab and select USB Dual Port for the serial mode. Then press Apply Settings.

Then return to the main Status tab to calibrate your servos.
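
If you later want your own code, rather than the Control Center, to send target values, the Maestro's documented compact serial protocol can be used: a Set Target command is the byte 0x84, followed by the channel number and the target in quarter-microseconds, split into two 7-bit bytes (low bits first). A minimal sketch:

```python
def set_target_bytes(channel, target_us):
    """Encode a Pololu compact-protocol Set Target command.

    The Maestro expects the target in quarter-microseconds,
    split into two 7-bit bytes after the 0x84 command byte.
    """
    quarter_us = int(round(target_us * 4))
    return bytes([0x84, channel, quarter_us & 0x7F, (quarter_us >> 7) & 0x7F])

# Example: move channel 0 to 1500 us, a typical neutral position.
cmd = set_target_bytes(0, 1500)
```

With the pyserial package, these bytes can be written to the Maestro's command port (the port name, e.g. COM5 on Windows or /dev/ttyACM0 on Linux, depends on your system and is an assumption here).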

Begin by putting the sliders in the "far" position in which the front of the slider is flush with the arm, and put the grippers in the "neutral" position, as shown on the image below. Write down the "far" target values of the rack-and-pinion servos, and "neutral" target values for the gripper servos.

Next, determine the target value for a 90° rotation for each gripper servo. On the image below, the right arm's gripper servo has been turned 90° relative to its neutral position. Do this for each gripper servo, and write down the target values for all four.

It is absolutely critical that all the gripper servos move from their neutral to 90° positions in the directions marked by red arrows on the image below:

Finally, put the gripper servos back in the neutral position, and insert a Rubik's cube in the bottom gripper. Determine the "near" positions of all four rack-and-pinion servos in which the cube is tightly hugged and centered, as shown on the image below. Write down these target values.

Once acceptable target values for all servos have been determined, they need to be transferred to our application via its own user interface. The values can later be adjusted, if necessary.

Note that during calibration, the gripper and slider positions may need to be adjusted to allow a proper movement range. To adjust the position of a gripper, it needs to be removed from the gripper servo's shaft and then re-attached in a different position. Once the final position is found, the servo horn's clamp needs to be tightened with an Allen hex key. To adjust the slider's position, the pinion needs to be removed, the slider shifted as necessary, and then the pinion re-attached.

For more information on using the Maestro servo controller and its Control Center software, please refer to the Pololu web site.

9. Software Installation and Configuration

UPDATE November 21, 2017
Version 2.0 of the RubiksCubeRobot application is now in beta and available for public testing. The new version works better and has tons of cool new features. For more info, click here.

9.1 Introduction

The robot is driven by a Universal Windows Platform (UWP) application of our own creation called RubiksCubeRobot. By sending control signals to the robot's 8 servos and webcam, the application photographs the cube, performs the image and color recognition on the photos, determines the initial position of the cube, computes the sequence of rotations necessary to solve the cube, and then executes the sequence. Please download the application via the links below.

9.2 Installation on Raspberry PI

Download Link for Raspberry PI
Size: 12.9 MB
Last Updated: 2017-10-23

The following instructions assume that you have already downloaded and installed the Windows 10 IoT Core Dashboard on your local PC, installed the Windows 10 IoT Core operating system on your Class-10 Micro SD card, booted your Raspberry PI from it, and connected the PI to your local network via WiFi and/or an Ethernet cable. Your Raspberry PI device should be showing in the My devices list of the IoT Dashboard:

To install the RubiksCubeRobot onto your PI, please follow these easy steps:

  • Download the .zip archive for RubiksCubeRobot from the link above. Unzip it to a temporary directory of your PC's hard drive, such as c:\tmp.
  • Select Open in Device Portal from the IoT Dashboard. In the Windows Device Portal, go to Apps, Apps manager.
  • Under Install app, for App package, select the file with the extension .appx in the temporary directory, and for Certificate, select the file with the extension .cer.
  • Click on Add dependency three times. For the three Dependency boxes, select the three files in the \Dependencies\ARM subfolder of the temporary directory.
  • Click on Go under Deploy.

That's it! RubiksCubeRobot should now appear under Apps. You can start the application by choosing "Start" in the Actions drop-down box, and mark it as startup by clicking on the radio button in the Startup column.

9.3 Installation on a Regular Windows 10 PC

Download the x86 or x64 versions from the links below:

Download Links for Windows 10 PC
Size: 12.9 MB
Last Updated: 2017-10-23

Size: 30.0 MB
Last Updated: 2017-10-23

Unzip the content of the download to a temporary directory such as "c:\temp". Prior to installing the app on your PC, you need to install the certificate in the Trusted Root Certification Authorities of both the Current User and Local Machine sections of the certificate store. This only needs to be done once.

Double-click on the .cer file in the temporary directory, click Install Certificate, select Current User, then select "Place all certificates in the following store", and select the Trusted Root Certification Authorities folder. Repeat this procedure, but this time select Local Machine instead of Current User.

Once the certificate is installed, double-click on the .appx file in the temporary directory to install the app on your PC.

9.4 Configuring and Running the Application

When you run the application for the first time, the following screen comes up:

The blue and red buttons of the main page perform the following functions:

  • calibration -- takes you to the Calibration Center where the servo target values are entered.
  • configuration -- takes you to the Configuration Center where the parameters responsible for image and color recognition can be viewed, and changed if necessary.
  • training -- performs color training by taking photos of a fully assembled cube.
  • analysis -- performs image and color recognition of the most recent set of photographs stored on the device's hard drive. This function is only used for debugging purposes.
  • key -- allows you to enter your registration key, and also activate your paid-for key permanently on the device.

  • OPEN -- brings the rack-and-pinion servos to the full-back position and gripper servos to the neutral position so that the cube can be inserted.
  • RUN -- starts the work sequence after the cube has been inserted.
  • STOP -- performs an emergency stop.
  • OFF -- switches all servos off.

The 1st required step is to click the key button and enter your registration key in the form XXXXX-XXXXX. Please contact us to obtain your free 30-day evaluation key. During evaluation, the application performs run-time key validation over the Internet, so your Raspberry PI (or PC) must be connected to the Internet for the application to function. As of a later version, your paid-for license key can be activated on the device permanently, and once that is done, the application no longer needs an Internet connection.

The 2nd required step is to enter all the servo target values in the Calibration Center:

The screenshot above shows the settings for our robot, but you must obtain your own numbers during calibration using the Pololu Maestro Control Center software.

Our image and color recognition algorithm uses a number of parameters to convert the color photographs to dual-tone images for line detection, and to determine the colors of the cube squares. These parameters can be modified in the Configuration Center. If you are running the application on a PC that has another camera connected to it (laptops almost always do), you need to select the robot's webcam via the Camera Name drop-down box.

The parameter "White if Saturation below" was added in a later version to address situations where the white cubies are mistaken for other colors, such as blue, red or orange. The default value of 0 has no effect. If the value is set to a number between 0 and 1, such as 0.2, a cubie is identified as white whenever its saturation is below 0.2, regardless of the hue. Decrease this parameter or set it to 0 if non-whites are mistaken for whites.
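
The rule can be expressed as a tiny predicate. This sketch is an illustration only, not the app's actual code; the parameter name mirrors the Configuration Center label.

```python
def white_by_saturation(saturation, white_if_saturation_below=0.0):
    """Return True when the cubie should be forced to white, regardless
    of hue. The default threshold of 0 disables the check entirely."""
    if white_if_saturation_below > 0 and saturation < white_if_saturation_below:
        return True
    return False  # fall through to normal hue-based matching
```

With a threshold of 0.2, a cubie with saturation 0.1 is called white no matter what its hue says.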

As of a later version, a new parameter box, Miscellaneous, has been added for entering various other parameters. One or more codewords can optionally be entered in this box. Currently the only supported codeword is DESKEW, which instructs the app to perform image de-skewing to avoid "no lines found" errors. Future versions will support more codewords.

Changing all other parameters manually is generally not recommended. Refer to Section 9.5 below or contact us if the robot consistently reports errors such as ERROR_NOLINESFOUND, ERROR_TOOMANY_MISIDENTS, or similar.

Before clicking the RUN button, make sure all 8 servos are plugged into the Maestro servo controller, both the Maestro and webcam are plugged into the Raspberry PI's (or your PC's) USB ports, and the power source is connected to the Maestro.

We recommend that the first test run be performed without a Rubik's cube. The robot will perform the necessary manipulations of the arms and grippers, and take 12 photographs, which are displayed immediately. If the cube is not inserted, the image processing phase will, understandably, fail. An error will be displayed, and the arms and grippers will return to the open position.

Once the first test run is finished, insert the cube in the grippers and proceed with a live test run. Make sure the cube on the photographs is in focus, and roughly centered. Adjust the camera's focus and direction of the lens, if necessary.

To improve color recognition, the robot can be run in the color training mode to compute color hues reflecting your own unique lighting conditions. Insert a fully assembled cube with the white face facing the camera and red face pointing upwards, and press the train button. The cube will be photographed and new color hues calculated based on those photographs. You will then be asked if you want to save these new hue values. You can always return to the "factory" settings by clicking the Restore Defaults button.

9.5 Troubleshooting

You have printed the parts, ordered the hardware, put the robot together and run it. The robot has photographed the cube dutifully, but instead of putting it back together as it should, it is displaying an error such as ERROR_TOOMANY_MISIDENTS, or ERROR_NOLINESFOUND, or similar.

Do not despair! Most of these errors can be fixed by adjusting the camera position, calibration data, configuration parameters, or all of the above. But you don't want to fly blind. To know exactly what to adjust, you need to see what the robot is seeing, and understand how it processes the input data.

We have created an online application which gives you a glimpse of the robot's thinking process, and appropriately called it "Through the Eye of the Robot", or simply the Eye. It is available at the following URL:


To use the Eye, enable debugging in the Configuration Center, insert the cube and press the RUN button. In the debug mode, 13 files with the same numeric prefix will be created in the \Data\Users\DefaultAccount\Pictures folder on Raspberry PI (and in This PC\Pictures on a regular PC.) For example:


The .pix files contain the pixel data of the 12 photographs (2 per face) the robot has taken during the photographing phase. The .txt file contains the current configuration parameters.
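
If you want to collect the 13 files of one debugging run programmatically before uploading them, grouping by the shared numeric prefix is straightforward. The exact filename pattern is an assumption here; the only documented fact is that all files of a run share one numeric prefix.

```python
import re
from collections import defaultdict

def group_debug_files(filenames):
    """Group debug files (.pix and .txt) by their leading numeric prefix.
    Filenames without a numeric prefix are ignored."""
    groups = defaultdict(list)
    for name in filenames:
        m = re.match(r"(\d+)", name)
        if m:
            groups[m.group(1)].append(name)
    return dict(groups)
```

Each resulting group should contain the 12 .pix pixel-data files and the one .txt configuration file of a single run.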

The Eye's upload form allows you to upload these files to our server for processing. The Eye performs the same color recognition procedures on the uploaded files as the robot would do, and produces a PDF file summarizing its thought process.

Let's look at a typical page from a PDF document produced by the Eye:

At the top, there are two photographs of a cube face taken by the robot. The identified cubie zones are marked with gray dotted rectangles. In the middle, the dual-tone edge maps of the photos are shown, which are used to identify the cubie zones. At the bottom, a diagram on the right displays the cubie colors obtained from the photographs along with their hue and saturation values. A diagram on the left displays the final result of the color recognition process for this cube face.

This information gives you valuable insight into the robot's "brain" and helps you make necessary adjustments to get the robot fully operational. The following sub-sections describe some of the problems you may encounter, and the troubleshooting guidelines.

9.5.1 Photographs not Centered

If the camera is not properly mounted or not pointing at the middle of the cube, a photo may look like this:

The image is badly off-center. If the robot can't see the entire face of the cube, it can't process it, and the error ERROR_NOLINESFOUND is likely to ensue. Make sure the central cubie is roughly in the middle of the photograph. Adjust the camera's position in the camera holder if necessary. Also make sure the grippers hug the cube tightly to avoid shifting while the cube is being photographed. Adjust servo calibration values if necessary.

9.5.2 Camera not in Focus

If the camera is badly out of focus, as on the picture below, the blurriness of the photo may cause the app's line detection subroutine to fail, resulting in a "no lines found" error.

The camera model we use is equipped with a manual-focus lens. Rotate the lens until the photos taken by the camera are sharp.

9.5.3 Excessive Adaptive Threshold Parameter

To detect individual cubies on a photograph, the robot computes the "edge map" of the image based on the Adaptive Threshold parameter (60 by default.) If lighting is insufficient, the edge map may not have enough white pixels for zone detection:

The edge map shown here will likely generate the errors ERROR_NOLINESFOUND, ERROR_NOHORIZLINESFOUND or ERROR_NOVERTLINESFOUND.

To get more white pixels to appear on the edge maps, reduce the Adaptive Threshold value. The following edge map was generated with the Adaptive Threshold set to 20:

As a result, zone detection has worked well:
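
The general idea of an adaptive-threshold edge map can be sketched in a few lines of Python. This is an illustration of the technique only; the app's actual algorithm is not published, and the neighborhood size used here is an assumption. The default threshold of 60 mirrors the app's default.

```python
def edge_map(gray, radius=1, threshold=60):
    """Sketch of an adaptive-threshold edge map: a pixel turns white when
    it differs from its local neighborhood mean by more than `threshold`.
    Lowering the threshold (as recommended above for dim lighting) lets
    more white pixels through."""
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Neighborhood mean within `radius`, clamped at the borders.
            vals = [gray[j][i]
                    for j in range(max(0, y - radius), min(h, y + radius + 1))
                    for i in range(max(0, x - radius), min(w, x + radius + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 255 if abs(gray[y][x] - mean) > threshold else 0
    return out
```

This shows why reducing the threshold helps: a smaller value lets lower-contrast pixels pass, so dimly lit cubie edges still register as white.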

9.5.4 Skewed Images

Due to an incorrect camera position, or for some other reason, an image may appear tilted, resulting in a "no lines found" error:

As of a later version, the application supports automatic deskewing. To enable it, enter the codeword DESKEW (case is immaterial) in the Miscellaneous box on the Configuration Center page of the app. As a result, the photographs are deskewed, and the zones are identified correctly:

9.5.5 Non-White Color Mis-Identification

Except for white colors, the app determines a cubie's color by comparing the median hue of the cubie zone with the five color hues specified in the Configuration Center and finding the closest one. Red, orange, yellow and green colors are very close to each other on the "hue ring" and, as a result, the app may mix them up. The hue/saturation diagram, displayed in the bottom-right corner of every page of the PDF document, can help you configure the hue values to match your particular lighting conditions and camera sensor.
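
The "closest hue on the hue ring" rule can be sketched as follows. The 0-255 hue scale and the palette values are assumptions for illustration (your configured hues will differ), but the circular-distance logic is the point.

```python
def hue_distance(h1, h2, wheel=256):
    """Distance between two hues on a circular hue ring."""
    d = abs(h1 - h2) % wheel
    return min(d, wheel - d)

# Illustrative hue values only -- not the app's defaults.
HUES = {"red": 10, "orange": 40, "yellow": 70, "green": 110, "blue": 150}

def closest_color(hue, hues=HUES):
    """Pick the configured color whose hue is nearest on the hue ring."""
    return min(hues, key=lambda name: hue_distance(hue, hues[name]))
```

Because red, orange and yellow sit close together on the ring, a small shift of one configured hue value (as in the Green Hue 180-to-110 example below) can flip many classifications at once.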

Consider the following pair of photographs:

With the default color hue values, the app mistakes green for yellow, and yellow for orange:

However, using the hue/saturation diagram shown above on the right, it is clear that the Green Hue value should be changed from 180 to about 110, and Orange Hue from 60 to about 40. Indeed, these changes fix the mis-identification problem:

9.5.6 White Color Mis-Identification

The white color is different from the other 5 colors in that it has no hue value of its own and cannot be identified the same way as the others. What makes the white color different from the others is a very low saturation value. The pure white color has a saturation value of 0. The white cubies on the photographs do have non-zero saturation values, but they are usually still much lower than that of the colored cubies. To take advantage of this property of the white color, set the "White if Saturation below" parameter to a low non-zero value, such as 0.2.

The following pair of photographs demonstrates the effect of this parameter:

The default "White if Saturation below" value of 0 causes the white colors to be mistaken for blue and red:

However, setting this value to 0.2 fixes the errors:

NOTE: Be careful not to set this parameter too high, or non-white cubies will be mis-identified as white.

In some cases, the white cubies on the photographs are so far from the pure white that they have saturation comparable with that of non-white colors. If that is the case, the "White if Saturation below" parameter alone is not sufficient. You need to specify which of the 5 standard colors the white color is similar to, hue-wise. That is what the "White is like" parameter is for. In many cases, white is similar to blue, as in the photos above. When a cubie's hue is identified as blue and the "White is like" parameter is set to blue also, the decision as to whether to identify this cubie as blue or white is based on the saturation value. If the saturation is below the "White Saturation Threshold" parameter, the color is identified as white, otherwise as blue.
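
Putting the white-handling rules together, the decision logic described above might look like this sketch. The parameter names mirror the Configuration Center labels; the exact order of checks inside the app, the 0-255 hue scale, and the sample hue values are assumptions.

```python
def classify_cubie(hue, saturation, config):
    """Illustrative sketch of the white-handling rules; not the app's code."""
    def ring_dist(a, b, wheel=256):
        d = abs(a - b) % wheel
        return min(d, wheel - d)

    # Rule 1: "White if Saturation below" overrides hue matching (0 disables it).
    thr = config["white_if_saturation_below"]
    if thr > 0 and saturation < thr:
        return "white"

    # Rule 2: pick the configured color whose hue is closest on the hue ring.
    color = min(config["hues"], key=lambda c: ring_dist(hue, config["hues"][c]))

    # Rule 3: "White is like" -- when the nearest color is the designated one,
    # saturation below "White Saturation Threshold" tips the decision to white.
    if color == config["white_is_like"] and saturation < config["white_saturation_threshold"]:
        return "white"
    return color

CFG = {  # illustrative values only
    "white_if_saturation_below": 0,
    "hues": {"red": 10, "orange": 40, "yellow": 70, "green": 110, "blue": 150},
    "white_is_like": "blue",
    "white_saturation_threshold": 0.44,
}
```

With this configuration, a low-saturation cubie whose hue lands near blue is called white, while a saturated one with the same hue stays blue.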

In the following example, the white color is similar to orange:

The default parameters cause the white cubies to be mis-identified as orange:

According to the hue/saturation diagram, the hue of the white cubies is around 18, which is close to the orange hue. Therefore, setting the "White is Like" parameter to "Orange", while keeping the "White Saturation Threshold" well above 0.44 fixes the error.

9.5.7 The Colors are Identified Correctly but the Cube is Not Being Resolved

It is critical that your robot photograph the cube's faces in a particular order. Please refer to our introduction video for the correct sequence of rotations during photographing. For example, if you insert the cube with the white center square pointing towards the camera and the red center square pointing upwards, then the correct order in which the center squares should appear on the photographs is:

White -> Red -> Yellow -> Orange -> Blue -> Green
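
A quick sanity check of the photographed order can be expressed as a one-line comparison (an illustration only, for a cube inserted white-toward-camera, red-up):

```python
# Required center-square order for a cube inserted white-toward-camera, red-up.
EXPECTED_CENTERS = ["white", "red", "yellow", "orange", "blue", "green"]

def face_order_ok(observed):
    """True when the photographed center squares follow the required order."""
    return observed == EXPECTED_CENTERS
```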

If some or all of the gripper servos are calibrated incorrectly and turn from the neutral to the 90° position in the wrong direction, the app won't be able to correctly reconstruct the cube's initial position and therefore won't be able to resolve it. The app may consistently report the error ERROR_CENTERSQUARE_MISIDENT even though all the colors are correctly identified. The app may also hang in the "Solving..." mode, or execute a sequence of turns after which the cube is still not resolved. This can also happen if the camera is mounted upside down or sideways.

9.5.8 Grippers are Retracted in the Wrong Order

The app relies not only on a particular order in which the cube faces are photographed, but also on a particular order in which the grippers are retracted. The first photograph of a pair must have the top and bottom grippers retracted, and the second, the left and right ones. If the retraction order is reversed due to an incorrect channel assignment of the servos, the app won't be able to correctly identify the colors of the side cubies, as the grippers will get in the way.

On the image above, the pink grippers retracting in the wrong order wreak havoc on the color recognition of the side cubies:

9.5.9 Still Having Problems?

Please do not hesitate to contact us if you continue having mis-identification or any other problems. We will do our best to help you resolve them.

10. Frequently Asked Questions

  • Is Raspberry PI required?
    No, any device with two USB ports can be used (one for the Maestro, the other for the webcam.) The software is currently available for Raspberry PI running Windows 10 IoT, as well as regular Windows 10 PCs.
  • Why was this particular webcam model chosen?
    This webcam was chosen because it produces small photos (640x480), which makes for faster processing, and because it is equipped with LED lights, nicely shaped, and inexpensive.
  • Why use a separate servo controller, why not control the servos directly with the Raspberry PI?
    Initially we tried to do just that, but failed to achieve satisfactory results even with 4 servos. Breadboarding with 8 servos does not seem to be reliable enough. However, we admit it may still be doable.
  • Why does the robot take two pictures per cube face instead of just one?
    The grippers, when engaged, cover a significant portion of the cube's face, which makes it difficult to accurately identify the colors of the side cubies. As a workaround, the robot photographs each face first with the vertical grippers retracted, and then the horizontal ones.
  • Why does the robot make three clockwise 90° turns instead of a single counterclockwise 90° turn?
    Because of slack between the gripper and the cube, it takes a greater-than-90° turn of the gripper to perform a 90° turn of the cube's face. Therefore, a 180° servo such as the one we are using cannot perform both 90° and -90° turns with proper precision. 270° servos would probably work better, and future versions of our app may support them.
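
The arithmetic behind this trick is simply that face rotations compose modulo 360°, so three clockwise quarter-turns net out to one counterclockwise quarter-turn:

```python
def net_turn(turns_deg):
    """Net face rotation after composing a sequence of turns, modulo 360."""
    return sum(turns_deg) % 360
```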

Related: How to Model a Rack-and-Pinion in Blender