Centerpiece
The Centerpiece Housing is the primary motion and working-medium delivery component of the system. Various concepts were evaluated through trade studies to ensure fulfillment of all the system-level requirements.
The four-arm linkage was the optimum configuration for applications on a rectangular vertical surface, maintaining positional accuracy of the centerpiece housing within the allotted target area. To avoid force couples being induced on the statically indeterminate structure, the load paths must be able to rotate, with the angle between adjacent arms changing for each unique location.
The centerpiece was designed with a disk shape. The disk's symmetry assists in balancing the different cable tensions. Cable linkages use a swaged cable end locked into the centerpiece housing with ball-lock pins for easy user setup and breakdown. The disk geometry also allows the dimensions to be scaled up or down for further customization or model variations.
The back of the housing serves as the entry point for the paint and air supply to the nozzle, routed from the lower paint reservoir.
It also includes a shield at the back to better contain the paint as it is sprayed onto the surface.
Robot Control
The main difficulty we encountered was determining how to control the robot. Due to my interest and background in math, I was tasked with developing the algorithm that would generate the robot’s path.
Initially I wanted to focus on a simple case where the robot would, for example, draw a logo on a wall. I saw a mural of the San Jose Sharks downtown and tried to use the logo as an example. Fortunately, the logo has well-defined lines, which made the edge detection process slightly easier.
Example project: the robot would draw the right image, and the artist would then fill in the details.
For this project I focused on the following three areas:
- Path Generation
- Postprocessing
- Motor control
The first challenge was specifying the robot's path. We needed a system that could track the robot's current position and from there calculate which motors needed to be on, at what speed, and for how long. The first step was to draw an outline that the robot could trace. I took what I had learned from a class on signal processing and applied it to this problem. I ended up using a mathematical tool called the wavelet transform, which can apply filters to an image in order to extract and manipulate the image details.
The High (H) and Low (L) filters are applied in the vertical direction, and then again in the horizontal direction. In practice, HH extracts changes in the diagonal direction, HL in the x direction, LH in the y direction, and LL is what is left after extraction. I then repeat the process on the LL matrix. Adding all the matrices together, along with some manipulation and some thresholding, should result in the outlines of the original image.
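The filtering step above can be sketched with a single level of a Haar wavelet decomposition, which is the simplest wavelet to write by hand (the project may well have used a different wavelet; this is just an illustration of the H/L split):

```python
import numpy as np

def haar_decompose(img):
    """One level of a 2D Haar wavelet decomposition.

    Applies the low-pass (average) and high-pass (difference) Haar
    filters along the vertical direction, then along the horizontal
    direction, producing the four subbands: LL, HL, LH, HH.
    """
    img = np.asarray(img, dtype=float)
    # Vertical pass: pair up adjacent rows.
    lo_v = (img[0::2, :] + img[1::2, :]) / 2.0   # L: averages
    hi_v = (img[0::2, :] - img[1::2, :]) / 2.0   # H: differences
    # Horizontal pass on each result: pair up adjacent columns.
    LL = (lo_v[:, 0::2] + lo_v[:, 1::2]) / 2.0   # approximation
    HL = (lo_v[:, 0::2] - lo_v[:, 1::2]) / 2.0   # changes in x
    LH = (hi_v[:, 0::2] + hi_v[:, 1::2]) / 2.0   # changes in y
    HH = (hi_v[:, 0::2] - hi_v[:, 1::2]) / 2.0   # diagonal changes
    return LL, HL, LH, HH
```

An image with a single vertical edge, for instance, produces energy only in the HL band, while LH and HH stay near zero.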
The images above represent the result of applying the H and L filters.
- The Bottom Right quadrant is the HH and it extracts changes in the diagonal direction.
- The Bottom Left is the LH which extracts vertical changes.
- The Top Right is the HL filter, extracting horizontal changes.
- Finally, the Top Left is the LL and is simply the result after the other features have been removed.
The image on the right demonstrates how different threshold levels affect the final result. Most free edge detection algorithms do not have this feature; only the paid versions do. One of the great properties of wavelet transforms is the concept of Multiresolution Analysis. In essence, this allows me to perform the same procedure on the resultant LL image in the top left, filtering the image a second time and extracting features that were missed the first time.
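The thresholding step can be sketched as combining the three detail subbands and keeping only magnitudes above a user-chosen level (the exact combination and threshold values used in the project may differ):

```python
import numpy as np

def edges_from_details(HL, LH, HH, threshold=0.1):
    """Combine the three detail subbands into a binary edge map.

    A pixel is marked as an edge when the combined detail magnitude
    exceeds the threshold; raising the threshold keeps only the
    strongest edges, as in the comparison images above.
    """
    magnitude = np.sqrt(HL**2 + LH**2 + HH**2)
    return magnitude > threshold
```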
At this point we had a set of coordinates the robot had to travel through; what we did not have was the optimal path. I found open-source software that can trace a path and convert it to a vector file. The Sharks logo is great for edge detection, but it has many lines and would be complicated to test, so I decided to start with the logo of my alma mater, the University of Arizona, which should be easy to extract and relatively easy to draw. I generated an SVG file using Inkscape. The SVG format contains the code for a path that the computer extracts and can draw at any resolution. I had to build a program that would extract the path in a way that could be useful to our project.
From left to right: Original Image, Extracted edges, partial SVG code.
At this point I should mention there are countless programs, both open source and proprietary, that do what I just did, and do it much better. However, these are designed for use in CNC machines with a single x-motor and a single y-motor. Our design has four motors, each of which affects both x and y movement. All this means we cannot use that software as is.
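The four-motor coupling can be sketched as an inverse-kinematics step: a target (x, y) position maps to four cable lengths. This is a hypothetical sketch that assumes the motors sit at the corners of a rectangular work area and each cable runs straight to the centerpiece; the real geometry may differ.

```python
import math

def cable_lengths(x, y, width, height):
    """Convert a target (x, y) position into four cable lengths.

    Assumes (hypothetically) one motor at each corner of a
    width x height work area, with the origin at the top left,
    matching the SVG coordinate convention used later.
    """
    corners = [(0, 0), (width, 0), (0, height), (width, height)]
    return [math.hypot(x - cx, y - cy) for cx, cy in corners]
```

Every move therefore changes all four lengths at once, which is why single-axis CNC software cannot drive this robot directly.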
SVG files are similar to G-code in a way: both specify coordinates and instructions for where to move. The difference is that G-code treats all curves as circular arcs of varying radius and arc length, whereas in SVG all curves are Bézier curves, which results in much shorter code than G-code.
In the code in the top-right picture you can see three letters: "M", "L", and "C". These are the instructions for how and where to move; this is our path. Occasionally you will also encounter "Z", but this does not affect our robot. Now we have the path the robot should follow. The next step is to convert it into a format that can control the robot, and to do that we must first determine what the letters and numbers mean.
First, the numbers are simply coordinates with the origin being the top left corner.
“M” simply tells the machine to move to a specific point without drawing anything. This represents the end of a particular path and moves to the start of another line.
“L” tells the machine to draw a straight line between each set of points that follows the “L” until it encounters another letter.
"C" refers to a Bézier curve; this is where I ran into the most difficulty. In order to draw a curve you need a set of start and end points as well as one or more control points. Wikipedia has a much better explanation. Below are animations of quadratic and cubic curves. The "C" path consists of both cases.
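The "M"/"L"/"C" instructions above can be read with a small parser. This is a minimal sketch that handles only absolute, uppercase commands with space-separated coordinates (real SVG paths also allow commas, relative lowercase commands, and other shorthand):

```python
import re

def parse_path(d):
    """Parse a minimal subset of the SVG path mini-language.

    Handles absolute M (move), L (line), and C (cubic Bezier)
    commands, returning a list of (command, points) tuples.
    "Z" and relative (lowercase) commands are ignored for brevity.
    """
    tokens = re.findall(r'[MLC]|-?\d+\.?\d*', d)
    segments, cmd, nums = [], None, []

    def flush():
        if cmd and nums:
            pts = [(nums[i], nums[i + 1]) for i in range(0, len(nums), 2)]
            segments.append((cmd, pts))

    for tok in tokens:
        if tok in 'MLC':
            flush()
            cmd, nums = tok, []
        else:
            nums.append(float(tok))
    flush()
    return segments
```

For example, `parse_path("M 10 20 L 30 40 50 60")` yields one move segment followed by one line segment with two points.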
A complexity arises because sometimes the curve is quadratic and other times it is cubic. In essence, the beginning of the curve is always quadratic, with the subsequent points being cubic; however, the third point is reused as the first point of the next curve, along with three new control points. That final new point is in turn reused as the first point of the following curve, and so on. The end of the curve is usually quadratic.
Below is my first attempt at interpreting the SVG file and converting it to something that can control the robot. While developing the program in Python, I wanted to test it under different conditions to see whether it really worked. First, I deleted some points in the middle of the path and added an "M" so that it would jump to the next point. If you do this directly in SVG you can get strange results, like curves where there should be none (trust me, I tried). I also had the program end in the middle of a line; again, this can produce weird results in SVG, as it tries to complete a curve when it expects an end code and there is none. The results are below. Remember, the path is incomplete only for testing purposes.