Sunday, December 27, 2020

JavaFX monitor component

Aloha,

There was always something I wanted to do but never found the time for...a control that looks more or less like a heart rate monitor. Well, it's not really about monitoring the heart rate, but I like the heart rate signal, so it's more about the older tube based oscilloscopes. The thing that makes me want to create such a control is the fade out of the beam as it moves across the screen. In the tube based devices this effect came from a fluorescent layer applied to the inside of the front glass. When electrons hit this layer, it absorbed part of their kinetic energy and released it in the form of light (so called fluorescence). The layer is also metallized so that the electric charge of the electrons can flow off.

Now when the electron beam moved across the screen it left behind a fluorescent trace which formed the signal. Ok, so that is the thing I would like to create...not an oscilloscope but this fading effect when visualizing a signal.

So what we need is a line that fades out over time...sounds easy (and in the end the solution is easy) but getting the right idea on how to implement it took me some walks with the dog :)

The first idea is to simply draw a line and apply a horizontal linear gradient to the stroke that fades out to transparency. As long as the signal has a predominantly horizontal orientation this works really well. But as soon as the signal also has significant vertical movement we run into a problem. Let me try to visualize this in a little graphic:


In the picture above you see two signals that have the same horizontal width but a different signal length. On top I've added two rectangles that show the horizontal linear gradient that is used to fade out the signal. The fade out effect should be related to the length of the line and not to the width of the signal.
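Just to make this first idea concrete, in JavaFX it would look roughly like the following sketch (the canvas and the coordinate arrays are placeholders, not code from the control): the whole polyline gets a single horizontal gradient as its stroke.

import javafx.scene.canvas.Canvas;
import javafx.scene.canvas.GraphicsContext;
import javafx.scene.paint.Color;
import javafx.scene.paint.CycleMethod;
import javafx.scene.paint.LinearGradient;
import javafx.scene.paint.Stop;

// Sketch of the first (discarded) idea: fade the whole polyline with one
// horizontal gradient from transparent (left) to the signal color (right).
void drawWithHorizontalFade(final Canvas canvas, final double[] xPoints, final double[] yPoints) {
    final GraphicsContext ctx  = canvas.getGraphicsContext2D();
    final LinearGradient  fade = new LinearGradient(0, 0, canvas.getWidth(), 0, false, CycleMethod.NO_CYCLE,
                                                    new Stop(0.0, Color.TRANSPARENT),
                                                    new Stop(1.0, Color.LIME));
    ctx.clearRect(0, 0, canvas.getWidth(), canvas.getHeight());
    ctx.setStroke(fade);
    ctx.setLineWidth(2);
    ctx.strokePolyline(xPoints, yPoints, xPoints.length);
}

As described above, the fade here depends purely on the horizontal position, which is exactly the problem.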

To achieve this behavior I need to fade out the signal along the line and not only in the horizontal direction. So the fade out should follow the line. My idea was to use a queue with a fixed size that, once filled up to the given size, removes the head element whenever a new element is added to the end. With such a queue in place I can simply draw a line between each pair of neighboring elements of the queue and fade out these line segments while iterating over them during the drawing process.

I simply created a FixedSizeQueue that is based on an ArrayBlockingQueue and added the "auto-remove" functionality in the add() method. So the queue looks like this:

import java.util.concurrent.ArrayBlockingQueue;

public class FixedSizeQueue<E> extends ArrayBlockingQueue<E> {
    private final int capacity;

    public FixedSizeQueue(final int capacity) {
        super(capacity);
        this.capacity = capacity;
    }

    // Auto-remove: if the queue is full, drop the head before adding the new element
    @Override public boolean add(final E element) {
        if (super.size() == capacity) { this.remove(); }
        return super.add(element);
    }

    @SuppressWarnings("unchecked")
    public E getElementAt(final int index) {
        if (index < 0 || index > size() - 1) { throw new IllegalArgumentException("Index out of bounds."); }
        return (E) toArray()[index];
    }
}
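Used as a signal buffer, the queue behaves like this (a tiny usage example):

// Capacity 3: once the queue is full, the oldest element is removed automatically
FixedSizeQueue<Integer> buffer = new FixedSizeQueue<>(3);
buffer.add(1);
buffer.add(2);
buffer.add(3);
buffer.add(4);                               // full -> head element 1 is dropped
System.out.println(buffer);                  // [2, 3, 4]
System.out.println(buffer.getElementAt(0));  // 2 (oldest element in the buffer)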

With this queue in place I can now use it as some kind of signal buffer. I was playing around with different factors to fade out the signal but ended up with a simple linear approach. So I calculate the opacity factor by simply dividing 1 by the number of elements. When iterating over the points in the queue I just multiply the current index with the opacity factor, as you can see in the code snippet below:

// Point is a simple x/y holder; the queue contains the last n signal points
final Point[] points        = queue.toArray(new Point[0]);
final double  opacityFactor = 1.0 / points.length;
for (int i = 0; i < points.length - 1; i++) {
    if (points[i].x < points[i + 1].x) {
        lineCtx.setStroke(Color.color(red, green, blue, i * opacityFactor));
        lineCtx.strokeLine(points[i].x, points[i].y, points[i + 1].x, points[i + 1].y);
    }
}

As you can see, because I draw the line segments from left to right, their opacity increases from 0 to 1 along the line, so the oldest segments are nearly transparent and the newest ones are fully visible.

So the result looks good enough to me:


But this is only the effect of fading out a line along its elements...what about the rest of the control?

For the monitor control I use three Canvas nodes and an ImageView that are placed inside a Region. So the whole control uses 5 nodes on the scene graph. 

It's good practice to think about which elements in your control need to be drawn and when. This is sometimes a bit boring because with the compute power of today's machines you won't really see a big difference between optimized drawing and, let's call it, brute force drawing. Optimized drawing only draws the elements that are needed, whereas brute force drawing simply draws everything every time. A good way of testing your controls is to run them on an older Raspberry Pi. This device is fast for its size but slow compared to your desktop computer, and on such a Raspberry Pi you directly get visual feedback on how efficient your code is. On an older Pi everything counts...meaning if you only draw the stuff that is important you will really see the effect on the Pi.

So as I mentioned, I use three Canvas nodes:

  • background (rect filled with background color and raster with text)
  • line (the line segments)
  • dot (the leading dot with the glow effect)
You might argue I should draw the dot in the same Canvas as the line, but this only works for the mode where I fade out the signal line. The monitor control also has another mode where it always leaves the signal on the screen and only removes the part in front of the current position. This mode is common in today's heart rate monitors, as you know them from the local hospital. For this mode I don't want to clear the line Canvas on each draw, so I need to draw the dot on a separate Canvas.
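To make the layering a bit more concrete, a minimal skeleton of the node structure could look something like this (the class and field names are mine, not the ones from the actual control):

import javafx.scene.canvas.Canvas;
import javafx.scene.image.ImageView;
import javafx.scene.layout.Region;

// Sketch of the layer stack: background, line and dot canvases plus the crystal
// overlay, all stacked inside the Region that represents the control (5 nodes in total).
public class MonitorSkeleton extends Region {
    private final Canvas    backgroundCanvas = new Canvas();
    private final Canvas    lineCanvas       = new Canvas();
    private final Canvas    dotCanvas        = new Canvas();
    private final ImageView crystalOverlay   = new ImageView();

    public MonitorSkeleton() {
        // Children are painted in order, so the overlay ends up on top
        getChildren().addAll(backgroundCanvas, lineCanvas, dotCanvas, crystalOverlay);
    }

    @Override protected void layoutChildren() {
        final double w = getWidth();
        final double h = getHeight();
        backgroundCanvas.setWidth(w);  backgroundCanvas.setHeight(h);
        lineCanvas.setWidth(w);        lineCanvas.setHeight(h);
        dotCanvas.setWidth(w);         dotCanvas.setHeight(h);
        crystalOverlay.setFitWidth(w); crystalOverlay.setFitHeight(h);
    }
}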

Because I separated the background with the raster from the foreground with the line and the dot, I only need to draw the background when either a parameter has changed (e.g. the background color) or when the control was resized. This is what I mean by optimized drawing: I could also draw the background all the time and put it in the same Canvas as the line, but that is just not efficient.
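As a rough sketch of what that means in code inside a Region subclass like the skeleton above (the property and method names here are placeholders, not necessarily the ones used in the control), the expensive background drawing is only hooked to resize and parameter changes:

// Sketch: trigger the expensive background drawing only when it is really needed.
// backgroundColor is assumed to be an ObjectProperty<Color> of the control;
// resizeCanvases(), drawBackground() and drawSignal() are assumed helper methods.
private void registerListeners() {
    widthProperty().addListener(o -> { resizeCanvases(); drawBackground(); drawSignal(); });
    heightProperty().addListener(o -> { resizeCanvases(); drawBackground(); drawSignal(); });
    backgroundColor.addListener(o -> drawBackground());  // parameter changed -> redraw raster
    // new signal values only trigger drawSignal(), which touches the line and dot canvases
}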

In addition to the Canvas nodes I also use an ImageView to overlay the whole control with what I call a crystal overlay. The idea is to make the whole UI look more realistic. The key ingredient here is noise...adding noise to a surface makes it look more natural because in the real world nearly no surface is perfect. For example, if you take a look at a liquid crystal display (LCD), you will see some kind of structure in the background. This structure I try to imitate with an image that contains semi-transparent noise. If you simply put such an image on top of your control it will look more realistic.
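If you want to produce such a noise image yourself, a small sketch could look like this (the overlay image in the actual control may be generated differently):

import java.util.Random;
import javafx.scene.image.Image;
import javafx.scene.image.PixelWriter;
import javafx.scene.image.WritableImage;
import javafx.scene.paint.Color;

// Sketch: create an image filled with semi-transparent gray noise that can be
// put into the ImageView on top of the control.
public static Image createNoiseImage(final int width, final int height, final double maxOpacity) {
    final WritableImage image  = new WritableImage(width, height);
    final PixelWriter   writer = image.getPixelWriter();
    final Random        rnd    = new Random();
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            final double gray    = rnd.nextDouble();               // random brightness
            final double opacity = rnd.nextDouble() * maxOpacity;  // keep it subtle
            writer.setColor(x, y, Color.gray(gray, opacity));
        }
    }
    return image;
}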

Here is an example screenshot of the monitor control with both variants:


As you can see, the upper image has the crystal overlay effect switched off whereas the lower image has it switched on. In the monitor control you can decide whether you would like to use it or not.

So in principle we simply stack all the different layers on top of each other to get the final result.

I have also added some themes that contain commonly used color combinations for oscilloscopes, but you can of course also set all the different colors separately.

There is a Demo class in the code that lets you play around with the parameters, and if you would like to see it in action, here is a little video:

As always the code/binary is available in the following places:

github

bintray

maven central

I guess that's it for today...so keep coding...
