From Deep Blue to Deep Trouble?

On 11th May 1997, Deep Blue, a chess-playing computer developed by IBM, beat the then world champion Garry Kasparov at chess. It won the six-game match 3½–2½, with two wins, one loss and three draws. Kasparov accused IBM of cheating and demanded a rematch, but IBM refused and dismantled Deep Blue. Kasparov had beaten a previous version of Deep Blue in 1996.

By 2011, IBM had built an even more intelligent computer called Watson – named after IBM's first CEO, Thomas J. Watson. Watson is an artificial intelligence computer system capable of answering questions posed in natural language.

As a test of its abilities, Watson competed on the TV quiz show Jeopardy!, in the show's only human-versus-machine match to date.

 

Watson managed to beat Brad Rutter and Ken Jennings – respectively the biggest all-time money winner and the holder of the longest championship streak. Behind the scenes, Watson had access to 200 million pages of structured and unstructured data, consuming four terabytes of disk storage, including the full text of Wikipedia. However, it was not connected to the Internet during the game.

Today, Watson is marketed as a tool for people to explore and use. Watson is not alone: Microsoft have launched Azure ML, their machine learning platform, and every day new companies are opening for business, promising to provide the answers to humanity's toughest problems.

Computer scientists like Geoffrey Hinton, Yann LeCun and Andrew Ng are leading the way with improved machine learning techniques that have recently led to great advances in deep learning systems.

Software advances are being matched in hardware by the unstoppable Moore's law: the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years. The law is named after Intel co-founder Gordon Moore, who described the trend in a 1965 paper. His prediction has proven remarkably accurate – in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras are all improving at exponential rates as well.

Where will it all end? Each stage of technical development and each computerised victory brings us inevitably closer to the day that machines will outsmart humans…

There are those who call themselves Singularitarians who believe that the creation of a super intelligence, the Singularity, will happen in the near future and that deliberate action ought to be taken to ensure that this intelligence benefits humans. Singularitarians are distinguished from other futurists who speculate on a technological singularity by their belief that the Singularity is not only possible, but desirable if guided prudently.

On the flip side, there are some prominent figures, including Elon Musk and Stephen Hawking, who warn against major advances in artificial intelligence. In a recent interview with the BBC, Hawking stated:

“The primitive forms of artificial intelligence we already have, have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race. Once humans develop artificial intelligence it would take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded.”

I’m certainly not up there on the intelligence scales with Stephen Hawking, but I do have a view. We are undoubtedly developing computers that are becoming more intelligent. The problems these computers solve are very useful: self-driving cars and speech recognition – where would I be without Siri?!

However, these computers are in no way sentient – they are merely very good at recognising patterns and have no personal goals or desires. Animals made this jump with the evolution of the neocortex, which in many ways is what allows mammals to learn new behaviours and humans to develop conscious thought and language.

To match human-level intelligence, with goals and desires, we must make monumental advances in learning algorithms and develop fundamentally new approaches. We must learn to create the equivalent of a neocortex that sits over lower-level learning algorithms.

That’s not to say we won’t get there one day – I’m certain we will - but we’re a long way from that just yet and have plenty of time to think about necessary safety concerns.

I, for one, welcome our new machine overlords..!


From Neural Networks to Deep Learning

A few years ago, I began blogging about Neural Networks. I have had an interest in this side of machine learning for more time than I can remember. However, even though these amazingly useful constructs have been used to solve many real-world problems, they have never really delivered on the dream of a true artificial intelligence – until now. With the advent of Deep Learning algorithms, this is all about to change…

Neural Networks began as single layer networks that could be used to solve “linearly separable” classification problems. This type of network was known as the perceptron.

Some very bright people then discovered how to do “back propagation”, which allowed (in theory) multi-layer networks to solve any type of classification problem. The back propagation algorithm is so called because of the way it works – it compares the output of a network with the desired value and feeds back tiny amounts of the error through the network to modify the weights.

If you wanted to do something useful with a Neural Network, such as pattern recognition – identifying images that contain a car, say – you started by converting raw pixel inputs into feature activations. These feature activations were often hand crafted, designed to pick out something like an individual wheel or grill. The network would then learn how to weight the feature activations and decide what it was seeing in the image.

However, using back propagation to solve these problems really did not work for a number of reasons:

  • It’s really hard to hand craft feature detectors
  • It requires pre-classified (labelled) training data - almost all real world data is unlabelled
  • The learning time does not scale well - especially with really large networks
  • The network can often get stuck in “local optima” – it will stop learning before arriving at the correct solution

These serious limitations left neural networks as little more than a computer scientist’s plaything for several decades.

But then, with the passage of time, the story slowly changed. The rise of the Internet and Big Data brought with it huge amounts of labelled data. Computers also got faster – by orders of magnitude – especially with the arrival of Graphics Processing Units (GPUs). And, most importantly, we learnt new and better techniques to initialise the networks.

The key difference between techniques used in modern deep learning algorithms and the neural networks of old, is that the network creates its own feature detectors – they are not hand crafted. Therefore, the only limitation is computing power – and we have plenty of that!

Deep networks learn one layer at a time, using a generative model of the input (visible) data that connects to a layer of latent (hidden) nodes. The hidden layer is then used to train a second generative model against the next hidden layer, and so on. One technique used to achieve this is a restricted Boltzmann Machine (I’ll post some code next time).

Just like the human visual system, deep learning systems for image recognition process images in layers. For example, the first layer may learn correlations between pixels, beginning to form tiny edge detectors. By the time you reach the third or fourth layer, the activations could represent complete wheels, hands, faces, etc.

Fast forward to today – Google scientists have developed a computer program capable of learning a wide variety of tasks independently, in what has been hailed as “a significant step towards true artificial intelligence”.

The program learnt to play 49 different retro computer games, and came up with its own strategies for winning. The research was carried out by DeepMind, the British company bought by Google last year for £400m, whose stated aim is to build “smart machines”.

Likewise, Microsoft believes that too much of the world’s Big Data is going to waste and has just launched a new initiative to help organisations process it all, build APIs and finally make some sense out of it. The technology, called Azure Machine Learning (ML), is a new cloud-based service that can be accessed via any web browser. It features a simple drag-and-drop interface aimed at data scientists and developers. The main aim of Azure ML is to reduce the amount of work needed for organisations to deploy machine learning.

Not to be left behind, a Facebook project known as DeepFace can determine the true identity behind almost any picture of you. The DeepFace AI system is now powerful enough to spot individual users among the 400 million photos uploaded to the social network every single day.

In the future, deep learning systems could be used to power self-driving cars and smartphone personal assistants, or to conduct scientific research in fields from climate change to cosmology.

Exciting times..!


Biped Hip Replacement

This is a quick post to show the latest updates to my biped development robot.

Since last time I’ve added another degree of freedom to each hip by adding two further Robotis servos. This has had a dramatic effect, allowing me to transition the centre of gravity easily over the desired foot before attempting to stride.

I’ve also moved the battery and main processor off board to reduce weight and allow me to use the full power of my PC. By driving the robot directly from my PC, rather than an Arduino, I’m able to perform real-time diagnostics. This in turn has allowed me to greatly refine the walking gait.

To communicate with my PC I’m using the Robotis USB2Dynamixel adapter. This is a great gadget that has a few modes of operation – one of which allows me to convert USB signals to the TTL used by the Dynamixel AX series and hook in external power for the servos.

In total, this biped now has 10 degrees of freedom: two in each hip; one in each knee; two in each ankle. To make turning easier, I could add another degree of freedom to each hip or ankle, which would enable rotation on the transverse plane. But for now, I’m happy the way it is.

It’s a pretty agile little beast now – check out the video below…

 

 


Using Inverse Kinematics to Develop a Biped Robot Walking Gait in C#

When I created the eight degrees of freedom (8 DOF) biped robot in my last blog post, I wrote a C# application to calculate servo positions, which in turn generated a smooth, life-like walking gait. In this post I will walk through the application logic in more detail. The application generates a motion plan, runs the inverse kinematics calculations and allows me to visualise the results as a rendered stick man. The complete source code is below.

According to Wikipedia, “Inverse kinematics refers to the use of the kinematics equations of a robot to determine the joint parameters that provide a desired position of the end-effector. Specification of the movement of a robot so that its end-effector achieves a desired task is known as motion planning. Inverse kinematics transforms the motion plan into joint actuator trajectories for the robot.”

Starting with the motion plan and using the two rules established in my previous post (#1 static hip height and #2 clipped sinusoidal foot motion), I knew roughly what I wanted each joint to do. Next I had to model that mathematically. Hip height was easy, as it’s constant. Left and right feet follow the pattern illustrated below.

Biped Robot Hip & Foot Height Against Time

I created a system of triangles to represent each of the robot joints and could now begin to calculate the relative angles between them for each time interval. To preserve symmetry, I decided that the feet would always remain parallel with the body (and floor) and that the horizontal plane would always bisect the knee angle. These principles helped determine many of the joint angles using some simple trigonometry.

Biped Robot Limb Angles

All that remained was to solve any outstanding angles using the law of cosines. The law of cosines can be used in a number of ways - such as calculating the third side of a triangle when two sides and their enclosed angle are known, or to determine the angles of a triangle if all three sides are known. Once the angles are known, these can be translated into servo positions, with the appropriate amount of offset and direction applied.

This could now be implemented in code – the motion plan (determining foot geometry), the inverse kinematics (determining servo angle) calculation and the joint visualisation. I won’t walk through every step as the source code is pretty easy to decipher.

Note about the code: I separated the code into a few classes to keep key objects and values partitioned (legs, hip, etc). In this post I’ve compressed everything into a single file, which will execute – just paste the entire block into a new Windows Forms project. However, if you want to modify the code, for maintainability it would be best to break it back out into discrete class files.

I hope you find this useful.

 

Code Snippet
using System;
using System.Collections.Generic;
using System.Drawing;
using System.IO;
using System.Threading;
using System.Windows.Forms;
namespace Biped
{
    public class Canvass : Form
    {
        private List<int[]> _patterns = new List<int[]>();
        private Hip _hip = new Hip();
        private int _p = 0;
        [STAThread]
        static void Main()
        {
            Application.Run(new Canvass());
        }
        public Canvass()
        {
            this.Paint += Render;
            this.Height = 300;
            this.Width = 250;
            CalcFeetCoordinates();
        }
        private void CalcFeetCoordinates()
        {
            // Move left leg forward.
            for (int i = 0; i < 20; i++)
            {
                double x = (i - 10) * (Leg.StrideLength / 20.0);
                double y = Leg.HipHeight;
                AddStridePosition(x, y, -45);
            }
            // Move left leg backward.
            for (int i = 0; i < 20; i++)
            {
                double x = (10 - i) * (Leg.StrideLength / 20.0);
                double y = FootHeight(x);
                AddStridePosition(x, y, 45);
            }
            // Build right leg from phase shift clone of left.
            for (int i = 0; i < 40; i++)
                for (int j = 0; j < 4; j++)
                    _patterns[i][j + 4] = -_patterns[(i + 20) % 40][j];
            // Roll ankles on transition.
            RollAnkle(19, 20, 6);
            RollAnkle(45, 0, 6);
            // Write servo positions to file.
            DumpToFile();
        }
        private double FootHeight(double x)
        {
            return Leg.HipHeight - Leg.FootLift * Math.Cos(Math.Abs(x * Math.PI / Leg.StrideLength));
        }
        private void AddStridePosition(double x, double y, int tilt)
        {
            // Cosine rule: cos A = (b^2 + c^2 - a^2) / 2bc
            int[] pos = new int[8];
            double hypSqrd = Math.Pow(x, 2) + Math.Pow(y, 2);
            double hyp = Math.Sqrt(hypSqrd);
            pos[0] = 0 - RadToStep(Math.Acos(hyp / (2 * Leg.Bone)) - Math.Atan2(x, y));
            pos[1] = RadToStep(Math.Acos((2 * Leg.BoneSqrd - hypSqrd) / (2 * Leg.BoneSqrd))) - 512;
            pos[2] = pos[0] - pos[1];
            pos[3] = tilt;
            _patterns.Add(pos);
        }
        private void RollAnkle(int r1, int r2, int steps)
        {
            int[] row1 = _patterns[r1];
            int[] row2 = _patterns[r2];
            for (int i = 0; i < steps; i++)
            {
                int[] pos = new int[8];
                for (int j = 0; j < 8; j++)
                    pos[j] = row1[j] - ((row1[j] - row2[j]) * (i + 1)) / (steps + 1);
                _patterns.Insert(r1 + 1 + i, pos);
            }
        }
        private void Render(object sender, PaintEventArgs e)
        {
            _hip.Render(_patterns[_p++], e.Graphics);
            if (_p == _patterns.Count) _p = 0;
            this.Invalidate();
            Thread.Sleep(100);
        }
        private int RadToStep(double rads)
        {
            return (int)(rads * 512 / Math.PI);
        }
        private void DumpToFile()
        {
            using (TextWriter tw = new StreamWriter("biped.csv", false))
            {
                foreach (int[] pos in _patterns)
                    tw.WriteLine("{0}, {1}, {2}, {3}, {4}, {5}, {6}, {7}",
                        pos[0], pos[1], pos[2], pos[3], pos[4], pos[5], pos[6], pos[7]);
                tw.Close();
            }
        }
    }
    public class Hip
    {
        private Leg _leftLeg = new Leg();
        private Leg _rightLeg = new Leg();
        public void Render(int[] steps, Graphics graph)
        {
            _leftLeg.SetServos(steps[0], steps[1], steps[2], -1);
            _leftLeg.Render(graph, Pens.Black);
            _rightLeg.SetServos(steps[4], steps[5], steps[6], 1);
            _rightLeg.Render(graph, Pens.Blue);
        }
    }
    public class Leg
    {
        public static int Bone = 100;
        public static int BoneSqrd = Bone * Bone;
        public static int HipHeight = 180;
        public static int StrideLength = 60;
        public static int FootLift = 20;
        private static int _foot = Bone / 5;
        private double[] _joints = new double[3];
        public void SetServos(int hip, int knee, int ankle, int direction)
        {
            _joints[0] = StepToRad(hip * direction);
            _joints[1] = StepToRad(-knee * direction);
            _joints[2] = StepToRad(-ankle * direction + 256);
        }
        public void Render(Graphics g, Pen pen)
        {
            Point[] points = new Point[4];
            points[0] = new Point(100, 40);
            points[1] = new Point();
            points[1].X = points[0].X + (int)(Math.Sin(_joints[0]) * Bone);
            points[1].Y = points[0].Y + (int)(Math.Cos(_joints[0]) * Bone);
            points[2] = new Point();
            points[2].X = points[1].X + (int)(Math.Sin(_joints[0] + _joints[1]) * Bone);
            points[2].Y = points[1].Y + (int)(Math.Cos(_joints[0] + _joints[1]) * Bone);
            points[3] = new Point();
            points[3].X = points[2].X + (int)(Math.Sin(_joints[0] + _joints[1] + _joints[2]) * _foot);
            points[3].Y = points[2].Y + (int)(Math.Cos(_joints[0] + _joints[1] + _joints[2]) * _foot);
            for (int i = 0; i < 3; i++)
                g.DrawLine(pen, points[i], points[i + 1]);
        }
        private double StepToRad(int steps)
        {
            return Math.PI * steps / 512.0;
        }
    }
}

8 DOF Biped Robot using Dynamixel AX-12A Servos and Arduino

Buoyed by the success of my 6 DOF biped I decided to take the next step (no pun intended).

I purchased another Dynamixel AX-12A servo for each leg to give me eight degrees of freedom (DOF) in total. The hope was that this would result in a much more life like walking gait. Whilst ordering the servos, I also bought some more Robotis plastic frames to ease bolting this lot together.

The new servos and frames were fixed together in a similar fashion to the previous design, but now with an enhanced ankle joint. With 8 DOF, I could no longer work out joint angles in my head. It was time to break out some inverse kinematics!

Designing a walking gait from scratch is not that simple. I started by watching how people walk and tried to establish some simple rules I could emulate in code. My first observation was that humans have a really efficient walking gait. Our bodies carry a large mass above the waistline and we tend to keep that fairly stable whilst walking.

Rule #1: The robot’s hip height should remain constant.

Secondly, we raise and lower our feet very smoothly, just enough to achieve forward movement, which peaks in the middle of our stride.

Rule #2: The robot’s feet should follow a clipped sine wave.

With these two rules established I could now generate a system of triangles to calculate all servo positions at each point of the stride. I could solve any missing angles using the law of cosines. To help me with this job I wrote a C# application to crunch the numbers and visualise the task. The result was a [52, 8] matrix of servo positions that I could paste into a very small Arduino program.

I will walk through the C# application in detail in my next post.

The resulting Arduino code is posted below. Like before, the program refers to my Dynamixel class created in a previous post.

I’m really pleased with the results. Here is a video of my new 8 DOF biped walking across a glass table - it looks and sounds pretty sinister…

Code Snippet
#include "Dynamixel.h"
#include "Wire.h"
#define WALK_SWITCH  8
Dynamixel servo;
int pos[52][8] = {
{-95, -138, 43, -45, 41, 138, -97, -45},
{-93, -140, 47, -45, 50, 151, -101, -45},
{-92, -141, 49, -45, 59, 164, -105, -45},
{-90, -143, 53, -45, 67, 174, -107, -45},
{-88, -144, 56, -45, 74, 184, -110, -45},
{-85, -145, 60, -45, 80, 192, -112, -45},
{-83, -146, 63, -45, 87, 198, -111, -45},
{-81, -147, 66, -45, 92, 204, -112, -45},
{-78, -147, 69, -45, 97, 207, -110, -45},
{-76, -147, 71, -45, 101, 210, -109, -45},
{-73, -148, 75, -45, 104, 210, -106, -45},
{-70, -147, 77, -45, 107, 210, -103, -45},
{-67, -147, 80, -45, 109, 207, -98, -45},
{-64, -147, 83, -45, 110, 204, -94, -45},
{-61, -146, 85, -45, 110, 198, -88, -45},
{-58, -145, 87, -45, 110, 192, -82, -45},
{-55, -144, 89, -45, 109, 184, -75, -45},
{-52, -143, 91, -45, 106, 174, -68, -45},
{-48, -141, 93, -45, 103, 164, -61, -45},
{-45, -140, 95, -45, 100, 151, -51, -45},
{-45, -140, 95, -33, 100, 150, -50, -33},
{-44, -140, 95, -20, 99, 148, -49, -20},
{-44, -140, 95, -7, 98, 146, -48, -7},
{-43, -139, 96, 6, 98, 144, -47, 6},
{-43, -139, 96, 19, 97, 142, -46, 19},
{-42, -139, 96, 32, 96, 140, -45, 32},
{-41, -138, 97, 45, 95, 138, -43, 45},
{-50, -151, 101, 45, 93, 140, -47, 45},
{-59, -164, 105, 45, 92, 141, -49, 45},
{-67, -174, 107, 45, 90, 143, -53, 45},
{-74, -184, 110, 45, 88, 144, -56, 45},
{-80, -192, 112, 45, 85, 145, -60, 45},
{-87, -198, 111, 45, 83, 146, -63, 45},
{-92, -204, 112, 45, 81, 147, -66, 45},
{-97, -207, 110, 45, 78, 147, -69, 45},
{-101, -210, 109, 45, 76, 147, -71, 45},
{-104, -210, 106, 45, 73, 148, -75, 45},
{-107, -210, 103, 45, 70, 147, -77, 45},
{-109, -207, 98, 45, 67, 147, -80, 45},
{-110, -204, 94, 45, 64, 147, -83, 45},
{-110, -198, 88, 45, 61, 146, -85, 45},
{-110, -192, 82, 45, 58, 145, -87, 45},
{-109, -184, 75, 45, 55, 144, -89, 45},
{-106, -174, 68, 45, 52, 143, -91, 45},
{-103, -164, 61, 45, 48, 141, -93, 45},
{-100, -151, 51, 45, 45, 140, -95, 45},
{-100, -150, 50, 33, 45, 140, -95, 33},
{-99, -148, 49, 20, 44, 140, -95, 20},
{-98, -146, 48, 7, 44, 140, -95, 7},
{-98, -144, 47, -6, 43, 139, -96, -6},
{-97, -142, 46, -19, 43, 139, -96, -19},
{-96, -140, 45, -32, 42, 139, -96, -32}
};
int centre = 512;
byte p = 0;
void setup() {
  pinMode(WALK_SWITCH, INPUT);
  Serial.begin(1000000);
}
void loop() {
  if (digitalRead(WALK_SWITCH)) {
    update(p++);
    delay(38);
    if (p > 51) p = 0;
  }
}
void update(byte p) {
  int fix = 0;
  for (byte i = 0; i < 8; i++)
  {
    if (i==0) fix = -120;
    else if (i==4) fix = 120;
    else fix = 0;
    servo.setPos(i + 1, centre + pos[p][i] + fix, 0);
  }
}

6 DOF Biped Robot using Dynamixel AX-12A Servos and Arduino

Having mastered driving Robotis Dynamixel AX Servos with an Arduino, I wanted to do something practical with that knowledge. How about building a biped robot?

There are plenty of biped robot kits available, like the Lynxmotion BRAT and the Robotis Bioloid, but I wanted to build something from the parts I already had lying around.

Each of the six Dynamixel AX-12A servos I recently purchased came supplied with a U-shaped bracket and mounting plate. I decided that if I bolted these together I’d get a pretty decent pair of legs. This configuration would give me a total of six degrees of freedom (DOF) – three in each leg – hip and knee joints on the sagittal plane and an ankle joint on the coronal plane.

I cut and drilled a strip of aluminium to form a pelvis and tie the legs together. I had a sheet of 3mm HDPE (the same stuff plastic chopping boards are made from) lying around, so cut this into rectangles to form the feet.

For power, I reused a 5000mAh LiPo battery, which I cable-tied to the aluminium pelvis. This placed the centre of gravity nice and high, which is actually useful. On top of that, I taped an Arduino mounted in a plastic case and finally added the CDS55xx Driver Board (to interface the Arduino to the servos).

It's not the prettiest robot, but that completed the mechanical build.


Next came the software part, which turned out to be pretty simple. The key to getting a biped to walk well is developing an efficient walking gait. There are two different ways to approach this:

Static gait - the centre of gravity is projected inside the polygon formed by the robot’s feet. This is the simplest form, although looks artificial.

Dynamic gait - the centre of gravity is not necessarily projected within the polygon of the robot’s feet, however, dynamic balance is maintained. This is far more complex, but results in more natural movement.

For my first experiment, I chose a static gait. As I only had to contend with 6 DOF I decided I could dispense with complex inverse kinematics calculations and do it by hand. And, by keeping the motion of each leg symmetrical it’s easier to keep the centre of gravity central.

I settled on a repeating pattern of four poses that would make up the gait:

- Rotate ankles clockwise, shifting the centre of gravity to the left and lifting the right leg.
- Extend right leg forward and push backwards with the left leg.
- Rotate ankles anticlockwise, shifting the centre of gravity to the right and lifting the left leg.
- Extend left leg forward and push backwards with the right leg.

Here is a video of the completed biped walking happily across my floor.

Here is the code. Please note, I reference the Dynamixel class as featured in my last blog post.

Biped.ino

#include "Dynamixel.h"
#include "Wire.h"
 
Dynamixel servo;
int velocity = 90;
int centre = 511;
byte c = 0;
byte p = 0;
 
/* Offset array from centre position
 * L_HIP, L_KNEE, L_ANKLE, R_HIP, R_KNEE, R_ANKLE */
int pos[5][6] = {
  {   0,   0,   0,   0,   0,   0 }, // Stand upright
  { -55, -55,  25, -55, -55,  25 }, // Right leg forward
  { -55, -55, -25, -55, -55, -25 }, // Lean right
  {  55,  55, -25,  55,  55, -25 }, // Left leg forward
  {  55,  55,  25,  55,  55,  25 }  // Lean left
};
 
void setup() {
  Serial.begin(1000000);
  delay(5000);
}
 
void loop() {
  if (c < 4)
  {
    update(p++);
    if (p > 4)
    { 
      c++;
      p = 1;
    }
  }
  else update(0);
}
 
void update(byte p)
{
  // Update each servo position.
  for (byte i = 0; i < 6; i++)
    servo.setPos(i + 1, centre + pos[p][i], velocity);
 
  // Wait for motion to complete.
  delay(500);
}

Thanks for reading.
John


Driving Robotis Dynamixel Servos with Arduino

In my continuing quest for knowledge about robotics I recently bought some Robotis Dynamixel AX-12A servos, with the intention of hooking them up to an Arduino. These awesome little servos pack a real punch, with over 15 kg-cm of torque. There are plenty of hobby servos with similar torque, but what sets these apart is their ability to track and report their speed, temperature, shaft position, voltage, and load. This level of feedback is essential for building advanced robotics applications.

Robotis Dynamixel AX-12A

Unlike hobby servos, these servos operate using a half-duplex, TTL-level serial protocol. This is actually good news, as your micro-controller doesn’t need to worry about generating individual PWM signals for each servo. Instead, all sensor management and position control is handled by the servo's built-in micro-controller. Position and speed can be controlled with a 1024-step resolution. Wiring is pretty simple, with two connectors on each servo allowing a daisy chain to be constructed.

The main controller communicates with the Dynamixel servos by sending and receiving data packets. There are two types of packet: the Instruction Packet (sent from the main controller to the servos) and the Status Packet (sent from the servos to the main controller). By default, the communication speed is 1 Mbps. A factory-fresh servo has a preset ID of 01, which can easily be changed if you intend to run more than one servo from the same micro-controller.

The next challenge was to hook these up to an Arduino. This is not quite as simple as with a regular hobby servo, as you must convert the full-duplex signal coming from the Arduino RX/TX pins to the half-duplex required by the Dynamixel servos. Luckily, there is a simple piece of hardware that will do the job for you. I purchased the CDS55xx Driver Board from Robosavvy. This board integrates a half-duplex converter and a voltage regulator circuit, making it possible to connect Dynamixel AX servos directly to an Arduino.

Next I needed some code to drive these units. If you’ve read my previous post, you’ll know I don’t like bloated software libraries, so instead I created a bare bones class that allows me to set the servo ID and read & write servo positions – enough for current needs. The code is below – hope you find it useful…

Dynamixel.h

#include "Arduino.h"
 
// Registers
#define P_ID 3                   // ID {RD/WR}
#define P_GOAL_POSITION_L 30     // Goal Position L {RD/WR}
#define P_PRESENT_POSITION_L 36  // Present Position L {RD}
 
// Instructions
#define INST_READ 0x02           // Reading values in the Control Table.
#define INST_WRITE 0x03          // Writing values to the Control Table.
 
class Dynamixel{
public:
  Dynamixel();
  void setPos(byte id, int pos, int vel);
  void setID(byte id, byte newId);
  int getPos(byte id);
 
private:
  void WriteHeader(byte id, byte length, byte type);
};

Dynamixel.cpp

#include "Dynamixel.h"
 
Dynamixel::Dynamixel() {}
 
/* id = servo ID [0 - 255]
** pos = new position [0 - 1023]
** vel = servo velocity [0 - 1023] */
void Dynamixel::setPos(byte id, int pos, int vel)
{
  int writeLength = 7;
  byte pos_h = pos >> 8;    // Split position into high/low bytes.
  byte pos_l = pos & 0xFF;
  byte vel_h = vel >> 8;    // Split velocity into high/low bytes.
  byte vel_l = vel & 0xFF;
 
  // Write standard header.
  WriteHeader(id, writeLength, INST_WRITE);
  // Starting address of where the data is to be written.
  Serial.write(P_GOAL_POSITION_L);
  // Write position low byte.
  Serial.write(pos_l);
  // Write position high byte.
  Serial.write(pos_h);
  // Write velocity low byte.
  Serial.write(vel_l);
  // Write velocity high byte.
  Serial.write(vel_h);
  // Check Sum.  
  Serial.write((~(id + writeLength + INST_WRITE + P_GOAL_POSITION_L + pos_l + pos_h + vel_l + vel_h))&0xFF);
  // Wait for instruction to be processed.
  delay(2);
  // Discard return data.
  while (Serial.read() >= 0){}
}
 
/* id = servo ID [0 - 255] */
int Dynamixel::getPos(byte id)
{
  int writeLength = 4;
  int readLength = 2;
 
  // Write standard header.
  WriteHeader(id, writeLength, INST_READ);
  // Starting address of where the data is to be read.
  Serial.write(P_PRESENT_POSITION_L);
  // The length of data to read.
  Serial.write(readLength);
  // Check Sum.  
  Serial.write((~(id + writeLength + INST_READ + P_PRESENT_POSITION_L + readLength))&0xFF);
  // Wait for instruction to be processed.
  delay(2);
  // Discard extra data.
  for (int i = 0; i < 5; i++) Serial.read();
  // Read low byte.
  int low_Byte = Serial.read();
  // Read high byte.
  int high_byte = Serial.read();
  // Discard returned checksum.
  Serial.read();
  // Return position.
  return (int)high_byte << 8 | low_Byte;
}
 
/* id = servo ID [0 - 255]
** newId = new servo ID [0 - 255] */
void Dynamixel::setID(byte id, byte newId)
{
  int writeLength = 4;
 
  // Write standard header.
  WriteHeader(id, writeLength, INST_WRITE);
  // Starting address of where the data is to be written.
  Serial.write(P_ID);
  // New ID.
  Serial.write(newId);
  // Check Sum.
  Serial.write((~(id + writeLength + INST_WRITE + P_ID + newId))&0xFF);
}
 
void Dynamixel::WriteHeader(byte id, byte length, byte type)
{
  Serial.write(0xFF);
  Serial.write(0xFF);
  Serial.write(id);
  Serial.write(length);
  Serial.write(type);
}

Using an MPU-6050 Gyroscope & Accelerometer with Arduino

I recently purchased a SparkFun (InvenSense) MPU-6050, six degrees of freedom Gyroscope & Accelerometer from Robosavvy. It's a great bit of kit, which combines a 3-axis gyroscope and a 3-axis accelerometer on the same board. It hooks up easily to an Arduino using the I2C bus. So far, so good...

I then began searching the Internet for example code showing the two sensors working together. My search kept leading me to a huge, unwieldy library, which seemed very bloated considering all I wanted to do was read some values from the board.

I began to dig deeper and experiment, which enabled me to create the code sample below. It relies on using default values, which are fine for my application - and it's light! The accelerometer channels are very twitchy, so the general advice is to incorporate a low pass filter, which I've done.
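The filter is a first-order exponential (single-pole IIR) low pass: each new smoothed value blends the previous smoothed value with the raw reading. Pulled out as a standalone C++ sketch (my own factoring; alpha = 0.05 matches the 0.95 / 0.05 weighting in the Arduino loop() further down):

```cpp
#include <cassert>
#include <cmath>

// First-order exponential low pass filter. A small alpha gives heavy
// smoothing; alpha near 1 tracks the raw signal closely.
float lowPass(float previous, float raw, float alpha) {
    return (1.0f - alpha) * previous + alpha * raw;
}

// Run the filter n times against a constant input, starting from y0.
// With a steady input the output converges towards that input.
float lowPassSettle(float y0, float input, int n, float alpha) {
    float y = y0;
    for (int i = 0; i < n; i++) y = lowPass(y, input, alpha);
    return y;
}
```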

Hopefully, you will also find it useful for your projects.

Main Program

#include "Wire.h"
#include "MPU6050.h"
 
MPU6050 mpu;  
int ax, ay, az, gx, gy, gz;
float smooth_ax, smooth_ay, smooth_az;
 
void setup()
{
  Wire.begin();
  Serial.begin(38400);
  mpu.wakeup();  
}
 
void loop()
{
  /* Read device to extract current
   accelerometer and gyroscope values. */
  mpu.read6dof(&ax, &ay, &az, &gx, &gy, &gz);
 
  /* Apply low pass filter to smooth
   accelerometer values. */
  smooth_ax = 0.95 * smooth_ax + 0.05 * ax;
  smooth_ay = 0.95 * smooth_ay + 0.05 * ay;
  smooth_az = 0.95 * smooth_az + 0.05 * az;
 
  /* Output to serial monitor. */
  Serial.print(smooth_ax);
  Serial.print("\t");
  Serial.print(smooth_ay);
  Serial.print("\t");
  Serial.print(smooth_az);
  Serial.print("\t");
  Serial.print(gx);
  Serial.print("\t");
  Serial.print(gy);
  Serial.print("\t");
  Serial.println(gz);
}

MPU6050.h

#include "Arduino.h"
#include "Wire.h"
 
#define MPU6050_DEVICE_ADDRESS   0x68
#define MPU6050_RA_ACCEL_XOUT_H  0x3B
#define MPU6050_RA_PWR_MGMT_1    0x6B
#define MPU6050_PWR1_SLEEP_BIT   6
 
class MPU6050 {
public:
  MPU6050();
  void wakeup();
  void read6dof(int* ax, int* ay, int* az, int* gx, int* gy, int* gz);
 
private:
  byte id;
  byte buffer[14];
  void readByte(byte reg, byte *data);
  void readBytes(byte reg, byte len, byte *data);
  void writeBit(byte reg, byte num, byte data);
  void writeByte(byte reg, byte data);
};

MPU6050.cpp

#include "MPU6050.h"
 
MPU6050::MPU6050() {
  id = MPU6050_DEVICE_ADDRESS;
}
 
/* Wake up device and use default values for
 accelerometer (±2g) and gyroscope (±250°/sec). */
void MPU6050::wakeup() {
  writeBit(MPU6050_RA_PWR_MGMT_1, MPU6050_PWR1_SLEEP_BIT, 0);
}
 
/* Read device memory to extract current
 accelerometer and gyroscope values. */
void MPU6050::read6dof(int* ax, int* ay, int* az, int* gx, int* gy, int* gz) {
  readBytes(MPU6050_RA_ACCEL_XOUT_H, 14, buffer);
  *ax = (((int)buffer[0]) << 8) | buffer[1];
  *ay = (((int)buffer[2]) << 8) | buffer[3];
  *az = (((int)buffer[4]) << 8) | buffer[5];
  *gx = (((int)buffer[8]) << 8) | buffer[9];
  *gy = (((int)buffer[10]) << 8) | buffer[11];
  *gz = (((int)buffer[12]) << 8) | buffer[13];
}
 
/* Read a single byte from specified register. */
void MPU6050::readByte(byte reg, byte *data) {
  readBytes(reg, 1, data);
}
 
/* Read multiple bytes starting at specified register. */
void MPU6050::readBytes(byte reg, byte len, byte *data) {
  byte count = 0;
  Wire.beginTransmission(id);
  Wire.write(reg);
  Wire.requestFrom(id, len);
  while (Wire.available()) data[count++] = Wire.read();
  Wire.endTransmission();
}
 
/* Write bit to specified register and location. */
void MPU6050::writeBit(byte reg, byte num, byte data) {
  byte b;
  readByte(reg, &b);
  b = (data != 0) ? (b | (1 << num)) : (b & ~(1 << num));
  writeByte(reg, b);
}
 
/* Write byte to specified register. */
void MPU6050::writeByte(byte reg, byte data) {
  Wire.beginTransmission(id);
  Wire.write(reg); 
  Wire.write(data);
  Wire.endTransmission(); 
}
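The raw readings are signed 16-bit counts. At the default full-scale ranges the code above leaves in place, the InvenSense datasheet gives sensitivities of 16384 LSB/g and 131 LSB per °/s, so converting to physical units is straightforward. These helper names are my own, not part of the class:

```cpp
#include <cassert>
#include <cmath>

// Convert raw MPU-6050 counts to physical units, assuming the
// default full-scale ranges (sensitivities from the InvenSense
// MPU-6000/6050 datasheet):
//   +/-2 g       -> 16384 LSB per g
//   +/-250 deg/s -> 131 LSB per deg/s
float rawToG(int raw)         { return raw / 16384.0f; }
float rawToDegPerSec(int raw) { return raw / 131.0f; }
```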

Cleaning Noisy Time Series Data – Low Pass Filter C#

When working with time series data, like stock market prices, values can often contain a lot of noise, obscuring a real trend. One of the best ways to remove this noise is to run the data through a low pass filter.

Methods like simple moving averages and exponential moving averages are quick to implement and do a relatively good job. However, the disadvantage of these methods is that they only “look back” and do not take future values into account. This results in smoothed data that is out of phase with the original data-set, with peaks and troughs appearing later than they actually occurred.

A way to get around these issues is to implement a better filter, such as a Fast Fourier Transform or a Savitzky–Golay filter. However, these methods can be fairly complex and heavy to implement.

A simple method I use is shown below. I’m not sure if it’s a recognised technique, but I like to think of it as a one-dimensional radial basis function. It looks back and forward at a value’s nearest neighbours, taking a weighted average which decays exponentially with distance. And, like all good vacuum cleaners, this method cleans right up to the edges, by adding inferred linear slopes to the beginning and end of the cleaned data-set.
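To make the weighting concrete: a neighbour at distance d contributes with weight decay^|d|, and the weighted sum is normalised by the sum of all weights in the window. A small standalone C++ sketch of just that kernel (my own illustration, mirroring the Coefficients and divisor logic in the C# listing further down):

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>

// Weight for a neighbour at the given distance: decay^|distance|.
// The centre point (distance 0) gets weight 1.
double kernelWeight(int distance, double decay) {
    return std::pow(decay, std::abs(distance));
}

// Sum of all weights across the window [-range, range]; used as the
// divisor so the weighted average stays properly normalised.
double kernelDivisor(int range, double decay) {
    double divisor = 0.0;
    for (int d = -range; d <= range; d++)
        divisor += kernelWeight(d, decay);
    return divisor;
}
```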

The graph below shows a very noisy sine wave and its cleaner equivalent.

Here's the code - I hope you find it useful.

using System;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        int range = 5;      // Number of data points each side to sample.
        double decay = 0.8; // [0.0 - 1.0] How slowly to decay from raw value.
        double[] noisy = NoisySine();
        double[] clean = CleanData(noisy, range, decay);
        WriteFile(noisy, clean);
    }

    static private double[] CleanData(double[] noisy, int range, double decay)
    {
        double[] clean = new double[noisy.Length];
        double[] coefficients = Coefficients(range, decay);
        // Calculate divisor value.
        double divisor = 0;
        for (int i = -range; i <= range; i++)
            divisor += coefficients[Math.Abs(i)];
        // Clean main data.
        for (int i = range; i < clean.Length - range; i++)
        {
            double temp = 0;
            for (int j = -range; j <= range; j++)
                temp += noisy[i + j] * coefficients[Math.Abs(j)];
            clean[i] = temp / divisor;
        }
        // Calculate leading and trailing slopes.
        double leadSum = 0;
        double trailSum = 0;
        int leadRef = range;
        int trailRef = clean.Length - range - 1;
        for (int i = 1; i <= range; i++)
        {
            leadSum += (clean[leadRef] - clean[leadRef + i]) / i;
            trailSum += (clean[trailRef] - clean[trailRef - i]) / i;
        }
        double leadSlope = leadSum / range;
        double trailSlope = trailSum / range;
        // Clean edges.
        for (int i = 1; i <= range; i++)
        {
            clean[leadRef - i] = clean[leadRef] + leadSlope * i;
            clean[trailRef + i] = clean[trailRef] + trailSlope * i;
        }
        return clean;
    }

    static private double[] Coefficients(int range, double decay)
    {
        // Precalculate coefficients.
        double[] coefficients = new double[range + 1];
        for (int i = 0; i <= range; i++)
            coefficients[i] = Math.Pow(decay, i);
        return coefficients;
    }

    static private void WriteFile(double[] noisy, double[] clean)
    {
        using (TextWriter tw = new StreamWriter("data.csv"))
        {
            for (int i = 0; i < noisy.Length; i++)
                tw.WriteLine(string.Format("{0:0.00}, {1:0.00}", noisy[i], clean[i]));
        }
    }

    static private double[] NoisySine()
    {
        // Create a noisy sine wave.
        double[] noisySine = new double[180];
        Random rnd = new Random();
        for (int i = 0; i < 180; i++)
            noisySine[i] = Math.Sin(Math.PI * i / 90) + rnd.NextDouble() - 0.5;
        return noisySine;
    }
}

Extracting Plain Text from Web Page HTML C#

Natural language processing solutions, like Athena, require a good supply of high-quality text.

As well as loading in ad-hoc documents, I’ve given Athena free rein to browse the Internet as required. Its two main sources of information are Wikipedia and BBC News.

Wikipedia is great for providing domain knowledge and key facts, whilst the BBC News site is an excellent source of up-to-the-minute current affairs.

Anybody who has attempted to extract plain text from real world HTML will know that what should be a simple task can quickly snowball into a mammoth project.

There have been many debates on sites like Stack Overflow on how best to do this. Most people start their journey by using regular expressions (regex) - but this is really only viable with simple, well-formed HTML. Madness soon follows...

In the real world, HTML is not always well-formed, and in practice you will also want to ignore such things as adverts, menus and page navigation. To overcome this, you may consider creating a hybrid regex / imperative code parser. Suddenly, this is getting serious...

Luckily, if you’re using C#, you already have the perfect solution in your toolbox - the WebBrowser control in Windows Forms. This control already knows how to render web pages into text and is incredibly tolerant to badly formed HTML.

Using the HtmlDocument property in the WebBrowser control, you can easily navigate the document to find exactly the clean text portions you’re looking for. And, of course, just because this control sits in the System.Windows.Forms namespace doesn't mean you can’t use it in other types of application - just be sure to add the relevant assembly reference. One complication is that the WebBrowser control needs to run in its own thread (which is easy to work around).

In the simple example below, I have created a console application that allows you to type in a search phrase on the command line, which is sent to Google, extracting links to the BBC News website and returning relevant, clean, plain text.

Sites like BBC News are very well structured, thanks to their content management system. Therefore, by reading the CSS classname associated with HTML tags, you can easily isolate the information you require.

using System;
using System.Text;
using System.Threading;
using System.Windows.Forms;

class Program
{
    private string _plainText;

    static void Main(string[] args)
    {
        new Program();
    }

    private Program()
    {
        while (true)
        {
            Console.Write("> ");
            string phrase = Console.ReadLine();
            if (phrase.Length > 0)
            {
                // The WebBrowser control must run in a single-threaded apartment.
                Thread thread = new Thread(new ParameterizedThreadStart(GetPlainText));
                thread.SetApartmentState(ApartmentState.STA);
                thread.Start(phrase);
                thread.Join();
                Console.WriteLine();
                Console.WriteLine(_plainText);
                Console.WriteLine();
            }
        }
    }

    private void GetPlainText(object phrase)
    {
        // Ask Google for BBC News pages matching the phrase and take the first link.
        string uri = "";
        WebBrowser searchBrowser = new WebBrowser();
        searchBrowser.Url = new Uri(string.Format(@"http://www.google.com/search?as_q={0}&as_sitesearch=www.bbc.co.uk/news", phrase));
        while (searchBrowser.ReadyState != WebBrowserReadyState.Complete) Application.DoEvents();
        foreach (HtmlElement a in searchBrowser.Document.GetElementsByTagName("A"))
        {
            uri = a.GetAttribute("href");
            if (uri.StartsWith("http://www.bbc.co.uk/news")) break;
        }
        searchBrowser.Dispose();
        // Load the article itself.
        StringBuilder sb = new StringBuilder();
        WebBrowser webBrowser = new WebBrowser();
        webBrowser.Url = new Uri(uri);
        while (webBrowser.ReadyState != WebBrowserReadyState.Complete) Application.DoEvents();
        // Pick out the main heading.
        foreach (HtmlElement h1 in webBrowser.Document.GetElementsByTagName("H1"))
            sb.Append(h1.InnerText + ". ");
        // Select only the article text, ignoring everything else.
        foreach (HtmlElement div in webBrowser.Document.GetElementsByTagName("DIV"))
            if (div.GetAttribute("classname") == "story-body")
                foreach (HtmlElement p in div.GetElementsByTagName("P"))
                {
                    string classname = p.GetAttribute("classname");
                    if (classname == "introduction" || classname == "")
                        sb.Append(p.InnerText + " ");
                }
        webBrowser.Dispose();
        _plainText = sb.ToString();
    }
}

This is what the result looks like after searching for British Airways...

Happy screen scraping!
John