In all of my excitement after discovering that Efergy Wireless Energy Monitors could be decoded using the data from an RTL-SDR dongle and a program written by Nathaniel Elijah, there were still several investigations I needed to conduct to determine why I found decoding with the R820T more difficult.
I suppose the biggest thing to come out of my experiment is this:
I retract my finding that the E4000 is better at pulling signals from the air for decoding Efergy Wireless Energy Monitors. In fact, the R820T can perform similarly well, provided you take into account what I say in the following sections.
This is good news for everyone!
So you ain’t got no decode?
I started off just by plugging in the dongle and executing the given instructions, almost blindly, hoping for a result. When the E4000 decoded reliably and the R820T produced nothing despite my playing with the gain, I came to the wrong conclusion that the R820T was worse.
The reason why this is the case becomes apparent when I fire up SDR# with both dongles.
The above is the E4000. Note the bursts at six-second intervals on the waterfall, rising clearly above the background noise. The demodulation bandwidth has been set to WFM mode, 200kHz, centred near 433.55MHz to match the settings that would have been used for rtl_fm, with the frequency correction at 0ppm (as rtl_fm applies no correction of its own, to my knowledge).
This is the output from the R820T. Do I see a problem? Hell yeah I see a problem! See how the FSK is at the band edge – in fact, almost off the edge entirely? It’s outside the demodulation bandwidth due to crystal tolerances!
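To put a number on that offset: the burst sits roughly 40kHz below where the dongle thinks 433.55MHz is, which works out to about 92ppm of crystal error – entirely plausible for the cheap crystals fitted to these dongles. A quick sanity check (using the nominal frequency and the corrected figure I ended up using on this particular dongle):

```shell
# Crystal offset in ppm: (nominal - observed) / nominal * 1e6
awk 'BEGIN { nominal = 433550000; observed = 433510000;
             printf "%.1f ppm\n", (nominal - observed) / nominal * 1e6 }'
```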
It was at this point I went “D’oh. My bad!”
The solution to this is obvious. When issuing the rtl_fm part of the command, replace 433550000 with, say, 433510000 for this particular dongle. In fact, visualize your signal before you try to decode it – values anywhere from 433500000 to 433600000 may be needed. The aim is to keep the demodulation frequency at the centre of the bursts, both to allow for some drift as the temperature changes and to ease the burden on the demodulation algorithm.
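As a sketch of what the adjusted command might look like – the sample and resample rates, gain and decoder binary name here are assumptions based on my own build of Nathaniel Elijah’s code, so match them to your setup rather than copying blindly:

```shell
# rtl_fm tuned 40kHz low to compensate for this dongle's crystal offset,
# piped into the Efergy decoder (binary name from my build -- yours may differ)
rtl_fm -f 433510000 -s 200000 -r 96000 -g 19.7 2>/dev/null | ./EfergyRPI_001
```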
The second thing to notice is that the peaks are only just visible despite the tuner gain being maxed out. Yikes. Measuring the antenna gave me the answer – it’s tuned horribly for use at 433MHz.
The quarter wavelength at 433MHz is about 17.3cm. I cut a piece of solid-core wire to 18cm, stripped about 0.5-1cm off the end and (unlovingly) jammed it into the centre pin of the MMCX connector, leaving it standing vertically proud of the tuner. For comparison, the stock antenna is shown to the side. I would suggest you avoid doing this, as you may permanently splay the centre pin of your tuner and leave it unable to contact MMCX connectors properly in the future. If you damage your tuner this way, that’s your problem – I take no responsibility for what you do here.
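The quarter-wavelength figure comes straight from the speed of light divided by the frequency, then divided by four; cutting the wire slightly long just leaves some slack for the stripped end:

```shell
# Quarter wavelength at 433.55MHz: (c / f) / 4, expressed in centimetres
awk 'BEGIN { c = 299792458; f = 433.55e6;
             printf "%.1f cm\n", c / f / 4 * 100 }'
```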
Having the right antenna is key to getting a good signal. Unfortunately, a configuration like this puts the antenna right next to the tuner and computer, leaving it vulnerable to picking up hash noise that raises the noise floor. But I figured that as long as it improves the desired signal more than it raises the noise, it’s definitely worth it, because it improves the “golden metric” – the signal-to-noise ratio.
Notice that I’ve now got the gain turned down to just 3.7dB, and the signals stand out cleanly from the background noise. It’s an excellent outcome, and you really should at least take the effort to do something *this* simple to improve your reception.
Testing it all out
Seeing that my Chromebook was otherwise occupied, I grabbed my old Dell Inspiron 640m running Lubuntu 13.10 to do the experiments.
Grabbing the latest rtl-sdr source and building it was no hassle, and the same went for building the decoder source code per the instructions. One thing you will have to do is run sudo rmmod dvb_usb_rtl28xxu at the prompt after inserting the tuner, because otherwise a DVB driver will claim the USB tuner and prevent rtl_sdr from being able to use it at all.
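If you’d rather not type the rmmod line on every insertion, you can blacklist the kernel driver so it never claims the dongle in the first place. The module name is what I saw on my Lubuntu install; the config filename is just an arbitrary choice of mine:

```shell
# Unload the DVB driver that has grabbed the dongle (one-off)
sudo rmmod dvb_usb_rtl28xxu
# Stop it loading on future insertions by blacklisting it
echo 'blacklist dvb_usb_rtl28xxu' | sudo tee /etc/modprobe.d/blacklist-rtlsdr.conf
```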
Even stupidly leaving the gain maxed out at 49.6dB, it decoded reliably every six seconds despite picking up a whole lot of noise from the digital circuitry on the tuner itself. I’m sure the decoding reliability would only have improved with the gain turned down, as that would have improved the signal-to-noise ratio. The frequency was changed to 433.51MHz to account for the crystal differences.
This is a solid showing for the code – I haven’t touched it, and it runs on virtually everything. CPU utilization was much healthier running on a beefy thing such as a laptop.
This development leads to a thought. Many people are concerned about the advent of wireless smart meters, the ability of unauthorized people to access the data (and of authorized people to misuse it) – yet owning and operating a wireless energy monitor like this actively broadcasts your energy usage, without protection of any sort, to your immediate surroundings!
In fact, running a decoder system like this requires no interaction with the existing system – it does not need to be paired at all. Thus, a monitor like this could inadvertently reveal when you are home, when you are not, and what you are using at a given time. The paranoid would immediately consider this a giant risk; I think the risk is still rather small.
This system has no ability to switch my appliances on or off, burn down the house, etc.
But I suppose one should be aware that with a high-gain antenna, it may be possible to receive and decode usage figures from people’s transmitters well beyond the intended 10-100m range.
It’s best not to just take what you are given, plug it in and hope for the best – I came to the wrong conclusion because I blindly followed the instructions. Further investigation shows that decoding performance can be improved by proper selection of the centre demodulation frequency (duh) and by improving the antenna itself (double duh). R820T owners rejoice!