CTC Leiderdorp, June 2017

Measuring color with your smartphone

You have probably found out that your smartphone is not able to “see” color as accurately as you would like, for instance when you want to determine the color of a wall. This is partly because smartphones are not designed to be measuring instruments. But it is also because the lighting on the object, like the wall in this example, can differ hugely from picture to picture taken by your smartphone. Several companies came up with the idea of correcting the colors in the pictures by using a color card. Color cards are made up of several small colored squares and an open space in the middle. The color card is positioned over the object whose color needs to be determined. See below for further explanation.

Above are pictures of four color cards. The first card, called ColourClick, was introduced by AkzoNobel in 2010. The second one is the TechKon card, followed by the recently introduced color-eye card from XRite and the Douglas Card (2014), also from XRite. With the color-eye card XRite introduced a number of interesting innovations, like a black blob to detect the light distribution. We are very interested to see the results! (The Douglas Card was used for the example in this blog.)

How do digital color cards work?

The color cards have an open space in the middle of the card. To determine the color of an object the open space needs to be placed on top of the object. Then a picture is taken of the complete card to capture the color of the object as well as the surrounding colors on the card.

The idea is that the surrounding colors help to determine the color of the object. Without balancing the measurement against the known surrounding colors on the card, color fidelity (the quality of the colors captured by the smartphone alone) would be very low.

As we do not know the actual algorithms used in the apps of TechKon and XRite, we had to come up with one of our own. A very simple one, just for fun.

What we did was the following:

  • We measured the Douglas Color Card with a ColorCatch Nano. This instrument makes it possible to measure RGB values at pixel level. See also our overview of Low Cost Color Capture Devices.
  • Because the Douglas Card is meant for skin tones, we placed the card over a brownish color.
  • We took a picture of the card framing the brown object color and the surrounding colors on the card under natural daylight conditions.

The Colorix Nano is used to determine the RGB values of the color patches of the Douglas Color Card.

  • We also measured the color of the object with the Nano, so we knew the “real” color values. “Real” is in quotation marks because there is no defined standard for this kind of color measurement.
  • We used a drawing program to read the R, G and B values of the card colors and the object from the smartphone picture.
  • For every color on the card, we plotted the R, G and B values from the Nano against the corresponding RGB values from the iPhone picture (see the graphs). It is easy to see that there is a strong correlation between the RGB values measured with the Nano and the RGB values from the iPhone picture. Adding smarter corrections, like exponential fits, local color interpretation and look-up tables, has already improved the correlation, but let's stick to linear correlation for the sake of this article.
  • We then calculated the linear correlation between the R, G and B values measured with the Nano and the RGB values from the iPhone picture. The relation for R was: R-Nano = 1.0142 × R-iPhone + 25. This is remarkable because the slope (1.0142) is very close to one and hardly corrects anything; there is effectively only a white-point shift, indicated by the intercept. For G and B, similar intercepts (17 and 12) were found. Of course the algorithms in the apps from XRite and TechKon will be much smarter, but perhaps an app with only a white-balance shift can already add value to some processes. Such a card could be based on greyscale colors (from black to white) and would perhaps be easier to produce.
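The per-channel fit described above can be sketched in a few lines of Python. The patch values below are made-up illustrative numbers (roughly following the reported R-Nano = 1.0142 × R-iPhone + 25 relation), not actual measurements from the card:

```python
import numpy as np

# Hypothetical R values per color patch: read from the iPhone picture
# and measured with the Nano (illustrative numbers only).
iphone_r = np.array([30, 70, 110, 150, 190, 220])
nano_r = np.array([55, 96, 137, 177, 218, 248])

# Least-squares linear fit: nano_r ≈ slope * iphone_r + intercept
slope, intercept = np.polyfit(iphone_r, nano_r, 1)
print(f"R-Nano = {slope:.4f} * R-iPhone + {intercept:.1f}")

# The same fit is repeated independently for the G and B channels.
```

With real patch data the fit would, per the article, yield a slope near one and an intercept around 25 for R.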

In the table, the left column shows the color of the object as captured with the iPhone, the middle column the “real” color obtained with the Nano, and the right column the predicted color using the derived correction. As you can see, the predicted color is much closer to the real (Nano) color than the measured color in the first column. Apparently our simple linear algorithm worked!
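Applying the derived correction is then a one-liner per channel. The sketch below uses the reported slope for R (1.0142) and intercepts (25, 17, 12); the slopes for G and B are an assumption (taken as 1, since the article reports only their intercepts), and the sample input color is made up:

```python
# Per-channel (slope, intercept) from the linear fit; G and B slopes assumed ≈ 1.
CORRECTION = {"r": (1.0142, 25), "g": (1.0, 17), "b": (1.0, 12)}

def correct(rgb):
    """Map an iPhone-picture RGB triple to a predicted 'real' (Nano) value."""
    out = []
    for value, (slope, intercept) in zip(rgb, CORRECTION.values()):
        out.append(min(255, round(slope * value + intercept)))  # clamp to 8-bit range
    return tuple(out)

print(correct((120, 90, 60)))  # an illustrative brownish object color → (147, 107, 72)
```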

Remark: color is best expressed in Lab instead of RGB. For the sake of simplicity I used RGB values here, but there is a well-defined relation between RGB and Lab, so the outcomes should be very similar.
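For reference, the standard conversion from 8-bit sRGB to CIE Lab (via XYZ, assuming the D65 white point) looks like this:

```python
def srgb_to_lab(r, g, b):
    """Convert an 8-bit sRGB triple to CIE Lab (D65 white point)."""
    # 1. Undo the sRGB gamma curve.
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # 2. Linear RGB -> XYZ (sRGB matrix, D65).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # 3. XYZ -> Lab relative to the D65 reference white.
    def f(t):
        return t ** (1 / 3) if t > 216 / 24389 else (24389 / 27 * t + 16) / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

print(srgb_to_lab(255, 255, 255))  # white → L ≈ 100, a ≈ 0, b ≈ 0
```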

Please contact us if you need more information or want to comment on the content of this publication: info@coltechcon.com.