Signals – SYMON MUTHEMBA http://symonmk.com Let's Get Technical

Off-Grid Communications For The Masses: Smart Metering
http://symonmk.com/off-grid-communications-for-the-masses-smart-metering/
Mon, 30 Apr 2018 20:50:28 +0000

In East Africa, a large percentage of the population still does not have access to electrical energy and its benefits. To address this, several companies have developed micro-grids to provide AC power to rural East Africa. To sustain these grids, a remote, robust communication system has to be developed for metering and billing. In this post, I propose several efficient designs for a communication system that could be used to monitor and manage off-grid customers. Specifically, I cover the technologies that can be used, the hardware and software implementation of such a system, and how it can make business sense once equipment and operating costs are taken into account.

This post proposes a communications system that tackles the stated situation within the limitations set by a real-world scenario, i.e. budget, energy supply and manpower. A comprehensive approach to systems design is valuable for ensuring the sustainability of such a project. Sections in this post will cover the hardware, communication channels and protocols, remote monitoring systems and software that can be used to solve the stated design problem. The goal is to provide micro-grid providers with a trustworthy system for off-grid power management, and to offer locals a solution that sufficiently caters to their needs.
Hardware Implementation

A reliable remote metering system has a few basic characteristics:

- A smart metering system that connects every household, enabling 2-way data transfer between the customer and the utility provider
- A network technology to enable the 2-way communication (fixed wired or wireless)
- A software system that actively manages billing and analyses usage data

With a system defined as in Figure 1, we can start to see how to bring the hardware components together.

Smart Metering

Smart meters are already on the market, such as the Hexing Electric HXE 110-KP single-phase prepayment smart meter and the ZTE ZX E211 (Figure 2) single-phase prepayment smart meter. These meters meet Standard Transfer Specification (STS) standards and are fit for our application. The ZX E211 is the preferred choice here as it supports a variety of communication protocols (RS485, M-BUS, ZigBee, RF-MESH, PLC and GPRS). We will see how these communication protocols are used later in this post. The LoRa-based variant of the ZTE ZX E211 is particularly useful for long-distance communication and allows us to adjust several parameters such as the transmission rate and frequency. Its main feature is low power consumption, with a transmit current of less than 90 mA at 17 dBm, a receive current of less than 13 mA and a standby current of less than 0.7 µA. Since data communication may occur only a few times a day, the meter spends the vast majority of its time in standby. Depending on the data provided by this meter, or a comparable one on the market, we may choose to consider meters that do not conform to STS standards. This may give us access to communication protocols otherwise unavailable to us, but may limit scaling and future integration with the national grid. Fabricating a communications device alongside the meter may be required to send more usage statistics and deliver the desired data, which can then be used for analysis to improve the overall system.
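To get a feel for the numbers above, here is a back-of-the-envelope script estimating the charge such a meter draws per day from the quoted currents. The duty-cycle figures (seconds of transmit and receive per day) are my own assumptions for illustration, not vendor data:

```python
# Daily charge estimate for a LoRa-based meter, using the currents
# quoted above. tx_s / rx_s per day are assumed duty-cycle figures.
TX_MA, RX_MA, STANDBY_UA = 90.0, 13.0, 0.7

def daily_charge_mah(tx_s=4.0, rx_s=40.0):
    """Charge drawn per day in mAh, assuming tx_s seconds of transmit
    and rx_s seconds of receive; the rest of the day is standby."""
    standby_s = 24 * 3600 - tx_s - rx_s
    ma_s = TX_MA * tx_s + RX_MA * rx_s + (STANDBY_UA / 1000.0) * standby_s
    return ma_s / 3600.0  # convert mA*s to mAh

print(round(daily_charge_mah(), 3))
```

With a few short transmissions a day the total is a fraction of a milliamp-hour; varying the duty cycle in this sketch shows how quickly the transmit term comes to dominate the budget even though standby dominates the time.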
A DIY smart meter along these lines, with an AVR, PIC or FPGA as the processing IC, could be covered in a future post.

Communication System

This post discusses two concepts for smart meter communications in a rural area, based on two questions:

- Location size – Are the residents physically close to each other or spread out?
- Terrain – Is the area flat or hilly? Dense vegetation cover or dry grassland?

To meet the requirements of the location, I propose two systems that can be established: the RF-MESH network and the RF-STAR network. Both rely on wireless channels to carry data.

RF-MESH Network

This type of network relays data through neighbouring wireless devices in a mesh (chain) topology using low-power transceiver radios. It is suitable for close-knit residential areas with few obstacles and is cheap to implement and scale. The architecture consists of a low-power transceiver radio at every meter box plus data concentrators, as in Figure 3. A proposed transceiver is the Silicon Labs Si4463 chip, which facilitates the RF communications link. This is a transceiver I've worked with before on a previous project; schematics of the full transceiver system are covered here. It is a low-power transceiver with up to 20 dBm (100 mW) transmit output power and a receive sensitivity of -117 dBm. Its frequency band is 433.4–473.0 MHz, and up to 100 channels can be set up with a channel step of 400 kHz. A serial port baud rate of 2400 bps allows for an over-the-air rate of 5000 bps. This gives an operating range of 1000 m between modules under ideal conditions with a clear line of sight. A concentrator can then be installed somewhere central in the village to aggregate the data of multiple smart meters; one concentrator may support hundreds of smart meters. This system is resilient to sudden channel blocking, as communication can flow along alternative paths.
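As a sanity check on those radio figures, a free-space link budget can be sketched from the quoted 20 dBm transmit power and -117 dBm sensitivity. The 433 MHz carrier, 0 dBi antenna gains and the fade-margin framing are my assumptions for illustration:

```python
import math

# Free-space link check for the transceiver numbers quoted above.
def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB (d in km, f in MHz)."""
    return 32.44 + 20 * math.log10(d_km) + 20 * math.log10(f_mhz)

def rx_dbm(d_km, f_mhz=433.0, tx_dbm=20.0, gains_dbi=0.0):
    """Received level = TX power + antenna gains - path loss."""
    return tx_dbm + gains_dbi - fspl_db(d_km, f_mhz)

# Margin above the -117 dBm sensitivity at the quoted 1000 m range:
margin_at_1km = rx_dbm(1.0) - (-117.0)
print(round(margin_at_1km, 1))
```

The comfortable free-space margin at 1000 m is what gets consumed by real-world obstacles, multipath and fading, which is why the quoted range applies only under ideal line-of-sight conditions.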
The DRF1110N20-C concentrator works well with DRF1110N20-N network nodes on a sub-1 GHz channel. The concentrator can then upload the received data to the micro-grid databases at different times of the day, depending on the availability of the data network.

RF-STAR Network

This network type uses a point-to-multipoint (PtMP) configuration. It is admittedly more expensive than the RF-MESH network but is suitable for hilly terrain with thick vegetation and obstacles. The architecture consists of high-power radio transceivers with a line of sight to an omni-directional antenna radio, as in Figure 4.

To implement this system, a 2.4 GHz ISM channel may be used. A clear line of sight from the transmitter antenna to the receiver should be established; I recently wrote about the art of obtaining strong microwave links. The smart meter information […]

The post Off-Grid Communications For The Masses: Smart Metering appeared first on SYMON MUTHEMBA.

Simplification in Design of Wireless Systems: 5 Useful Steps
http://symonmk.com/design-wireless-systems/
Fri, 20 Apr 2018 08:02:26 +0000

As we all know, wireless is the preferred method of connectivity between most of our devices, and it will only take more precedence in the coming years. The number of connected devices per person and the demand for fast, reliable content delivery within a network are rapidly increasing. Add to that the ongoing craze of developing IoT devices and the super-scaling of server farms to support them: in my view, RF, DSP and embedded systems engineers will have a lot going on. This shift depends heavily on the wireless systems we build. In this post, I try to figure out the best way forward in the design of wireless systems.

The RF spectrum houses a number of wireless standards and media in use today, including WiFi, Bluetooth, FM broadcast, DVB-T, DAB, GSM, UMTS, LTE, WLAN and radar. The upcoming 5G standards are yet to be agreed upon, as I illustrated here, but we can consider some technologies already in use today such as MIMO and MU-MIMO. Engineers responsible for the development of these systems are aware of the standards involved; however, they are required to understand such a vast number of fields during the design phase that implementation takes a lot of time. This, of course, is uneconomical in the fast-paced world we live in. Fortunately, engineers have figured out that the design process can be simplified into 5 major blocks:

- Modelling and simulation of digital, RF and antenna systems
- Optimization of design algorithms
- Automatic HDL and C code generation for hardware and software implementation
- Prototype design and testing with SDR hardware
- Iterative verification using the model as a reference

Modelling and Simulation

There are a number of software packages in use for modelling. In today's case, we will consider MATLAB (free alternative: GNU Octave).
MATLAB is a renowned development kit for engineers and scientists. It is rare that you find something you can't do with this software and its additional toolboxes. Must be why it costs a kidney, but it's a good place to start. Simulink is an environment within MATLAB that you use to perform model-based design. I've developed a simple communications link, Figure 1, that I can use as a basis for further developments. The model in Figure 1 is initiated by the following script:

The model above used the DSP System Toolbox and the Communications System Toolbox. This model can be useful in the following ways:

- a starting point for system-level design and verification
- a test bench for design algorithms written in C
- a source for generating C or HDL code for use in DSP/FPGA implementation

The model also allows us to simulate the results of our input and process variables to ensure we are getting the desired outputs.

Algorithm Design and Optimization

Algorithms are the coded processes within a process block of a program; they define the steps between the START and STOP operations of a process. Simple well-known examples are flow control loops and error handling, and in higher-level languages we have object-oriented programming (OOP). A while back I wrote an article touching on FFT algorithms. Many programming environments are set up with debugging features for your code. They analyse your code and give warnings about bad syntax and compile errors, which is particularly useful before flashing bad code onto your hardware, where it may cause firmware failure. The best environments go a step further and allow you to optimize your code. You can set breakpoints to see what happens when your program reaches a certain step, allowing you to tweak your variables accordingly. This is very useful in precise calculations, characteristic of antenna design. A feature of MATLAB called the Profiler allows you to run your algorithm while measuring its performance.
It then generates a profile detailing the areas of your code that could use some improvement, based on how long each section took to run and how much processing resource it required.

HDL and C Code Generation

Hardware Description Languages (HDLs) and C/C++ play analogous roles in embedded design: HDLs describe hardware for implementation on FPGA devices, while C/C++ is compiled for supported microcontrollers and microprocessors. Together they are the core of every embedded system, like the Xilinx FPGA board in Figure 2. While developing complex wireless systems involving several devices, it is inefficient to keep simulated algorithms and IC programming separate. Software like MATLAB enables the automatic generation of HDL and C code using MATLAB Coder. To illustrate, we will generate C code from a Kalman filter algorithm. A Kalman filter is an optimal estimation algorithm used for parameter prediction. It is quite popular in vehicle navigation and guidance, computer vision and wireless systems design. MathWorks provides a write-up of the example in use. In that example, kalmanfilter.m is my function file and ObjTrack.m is my algorithm, which defines the inputs, runs the Kalman filter and plots the result in a graph, Figure 3.

Conversion involves using MATLAB Coder: add the function as the entry-point file, define its input types, then go ahead and build the C code, Figure 4. The generated C code can be obtained from your MATLAB code directory.

SDR Hardware Prototypes

Software-defined radios (SDRs) deserve a post of their own and thus will only be covered briefly here. Basically, these are radios whose components, traditionally implemented in hardware, are implemented in software. This means the filters, amplifiers and modulators/demodulators are implemented using programming languages. In the previous section we discussed how to perform code generation.
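As an aside, the filter at the heart of that Kalman example can be sketched in a few lines. This is a minimal scalar Kalman filter smoothing a noisy constant, not the object tracker of the MATLAB example, but it shows the same predict/update structure that the generated C code implements:

```python
# Minimal scalar Kalman filter: estimate a constant level from noisy
# measurements. q is assumed process noise, r assumed measurement noise.
def kalman_1d(measurements, q=1e-3, r=0.5):
    x, p = measurements[0], 1.0   # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with the measurement residual
        p *= (1 - k)              # update shrinks the variance
        estimates.append(x)
    return estimates

noisy = [1.1, 0.9, 1.2, 0.8, 1.05, 0.95]
print(round(kalman_1d(noisy)[-1], 2))
```

The estimate settles near the underlying level of 1.0 while the gain k falls as confidence grows, which is the behaviour the full matrix form generalizes to multi-dimensional states.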
What SDRs offer is the flexibility to test and implement wireless designs and architectures, with the provision to add more features in future. SDRs are used in conjunction with FPGAs, GPPs (general purpose processors), DSPs or ASICs (application-specific ICs) to implement various wireless architectures. It is a low-cost method that is becoming increasingly popular in wireless systems design.

Verification

Finally, the system is rigorously verified using simulated and on-field test parameters to ensure the best product […]

How Computers ‘See’ and Add Value to Your Media: An Intro to Computer Vision
http://symonmk.com/intro-computer-vision/
Tue, 10 Apr 2018 18:00:37 +0000

This decade has been defined by advancements in data-based technologies and learning algorithms. Terms like AI and automation have been used extensively to explain current trends in just about every industry. They have also been used to spark debates over fears of massive unemployment and increased consumerism that these technologies may bring. It is important that you, yes YOU, the reader, check and understand how these technologies may transform your way of life in the years to come. One you may or may not have heard of is Computer Vision, which is likely to transform my current area of work, and in this post I look at what it means for you and me.

Okay, Fancy Term, But What Is It?

Computer vision (CV) is a branch of computer science that deals with enabling computers to process digital visual data and perform computations to make decisions based on that data. In simpler terms, computers can see and respond to images and videos provided to them, live or recorded, with a high level of accuracy and understanding. Image processing algorithms are at the heart of this, analysing images and videos (videos are just images when taken frame by frame). However, computers can see more than just images of bananas: image processing algorithms can be applied to thermal (infrared) imaging, medical (X-ray and CT) scanning, satellite imaging and other forms humans can't detect. CV has proven incredibly important to some of the most talked-about companies in the world. Tesla is using CV to control their driverless cars, while Google Photos has already categorized my photos by people and places. These are just some of the ways CV is being used, but the possibilities are endless.

Is Computer Vision Important?
A study by Cisco projected that by 2019, 80% of all Internet traffic will be video. We are a year away from that reality. Hmm. Maybe I should be making videos instead of bloggi… I digress. According to that study, there is an ongoing explosion of video content. Without CV algorithms, most of the data that could be generated from that content will be wasted. In the media and entertainment world, useful information derived from videos can be used to more efficiently position and time adverts, with sufficient knowledge that they will be seen and interacted with. Check out how TheTake is doing this in a very interesting way. CV has been used extensively in sports broadcasting, especially for tracking fast-moving objects and object identification. Post-match analysis of sports videos, Figure 1, has given rise to richer sports commentaries, very useful for coaches and fans.

On a hardware level, CV is useful in the automotive industry (as discussed earlier with Tesla); in manufacturing, where quality assurance can be aided with CV (check out Sight Machine, a company that uses CV and other AI techniques to improve manufacturing); in the farming industry, where companies like Prospera use it to detect crop yields; and many more. These systems may work hand in hand with IoT devices to deliver decisions over the Internet. One limitation of CV applications is poor-quality images. However, that keeps changing year after year, with cameras capable of taking higher-resolution images at a higher dynamic range. They even include processors that perform image stabilization, noise reduction and defect removal, all while becoming smaller and more robust.

What Really Happens?

So far I've mentioned image processing and algorithms; let me explain further. An image fed to a computer can be broken down into individual pixels. Each pixel is defined by a color value, the chromaticity of the pixel.
There are several ways to represent color, but a popular scheme is the RGB value, which defines the intensity of red, green and blue as integers between 0 and 255, e.g. (201, 250, 100) represents a tennis-ball green. To perform image analysis, you tell the computer what RGB value requires tracking. In our example, you feed it the RGB value of tennis-ball green and images of a tennis court with an ongoing match. The scene is analysed pixel by pixel until it lands on the pixel whose RGB value has the smallest difference from the one provided. That covers the basics, but in reality things need to be more efficient than that. Analysis is better performed using kernels, which analyse a patch of pixels and characterize them. Kernels can then be combined to characterize a combination of features, and with this, complex images can be detected. Convolution can be further added to aid detection: a series of inputs from an image each carry a specific weight (the input value is multiplied by the weight) and the results are added together. This is used to generate useful kernels to further analyse the images. Such a system is a convolutional neural network (CNN), which learns to generate useful kernels.

Beyond this, CNNs may perform image processing in layers. Layer 1 may detect lines (1D), layer 2 may detect shapes (2D), layer 3 may detect shadows (3D) and so on. Usually, the greater the number of layers used, the better the computer's ability to accurately identify objects and make meaningful decisions. The use of a multitude of layers, as in Figure 2, gave rise to the term deep learning. This goes even further, with techniques like Markov models coming into play to provide more accurate results.

Where CV is Best Applied

CV is likely to revolutionize several fields and industries. We will begin to see smarter devices and robots using imaging to perform a variety of tasks.
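To tie the earlier kernel discussion down, a single convolution pass can be written in a few lines of pure Python. The 3×3 vertical-edge kernel here is a standard textbook illustration, not anything specific to the CNNs mentioned above:

```python
# A single "convolution" pass over a tiny grayscale image.
# (No kernel flip -- this is cross-correlation, which is what
# CNN layers actually compute.)
def convolve(image, kernel):
    """Valid-mode 2D pass (no padding) over lists of lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = 0
            for a in range(kh):
                for b in range(kw):
                    acc += image[i + a][j + b] * kernel[a][b]
            row.append(acc)
        out.append(row)
    return out

# A dark-to-bright vertical edge between columns 1 and 2:
img = [[0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 9, 9]]
edge = [[-1, 0, 1],
        [-1, 0, 1],
        [-1, 0, 1]]
print(convolve(img, edge))
```

The kernel produces large responses exactly where pixel intensity changes left to right, which is the sense in which a learned kernel "detects" a feature.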
Drones equipped with cameras could give reports on drought and forest cover and immediately help establish optimal irrigation schemes. CV experts and doctors could start pooling imaging records for faster and more accurate diagnoses. The results of applied computer vision could massively reduce the cost of goods as manufacturing processes become more streamlined. The entertainment sector stands to make massive profits from applied CV. Imagine being able to read your audience's reactions and quickly adjust […]

Easy Ways to Simulate the Strongest Microwave Links
http://symonmk.com/strongest-microwave-links/
Tue, 27 Feb 2018 20:47:15 +0000

Having recently been involved with several microwave link installations and servicing, I have gathered several best practices for installing digital IP microwaves and obtaining the best PtP (point-to-point) links from them. A PtP link is simply a directional link between microwave antennas with a clear line of sight. During installation, several calculations have to be made beforehand to ensure the best possible link and to mitigate the chances of link failure. With the recent development of online geographical maps, these calculations have been simplified even further, and in this post we will dive into using these tools. In my field of work we apply microwave links to provide distribution links and contribution links. Distribution links are mainly the links between a studio and a transmitting site; such a link is mentioned here. Contribution links are the links between an OB (outside broadcast) site and the studio.

Path Profiles

A path profile is a straight-line cross-section between two points on the earth's surface, used to obtain the clearance characteristics between them. From secondary school geography, for those who can remember, this was obtained by drawing a straight line between two points on a topographical map (scale of 1:50,000, for example), obtaining the altitudes of all the points on the line, then drawing an altitude-versus-distance graph. Joining the points on the graph smoothly gives you the path profile required for your point-to-point connection. Beyond this, an earth bulge calculation has to be performed for points on your profile close to the straight line of sight.
This compensates for atmospheric refraction: variations in the atmospheric refractive index are modelled by the effective earth radius factor k, and these variations lead to bending of the microwave ray. A value of k = 4/3 is usually taken for the standard atmosphere, which causes the microwave ray to bend slightly downwards. Other atmospheric conditions may cause the value of k to drop to about 2/3, which causes an upward bending of the ray. The earth bulge constraint helps you better plan the height at which to set your antennas, with the calculation:

    h = (d1 × d2) / (12.75 × k)

where h is the earth bulge in metres and d1 and d2 are the distances in kilometres between the particular point on the path and each end of the path.

After this we obtain the radius of the first Fresnel zone which, to put it technically, is the locus of all points surrounding a radio beam from which reflected rays would have a path length one half-wavelength greater than the direct ray; it may be simply understood as the region around the direct line of sight that must be kept clear for the strongest signal link. The first Fresnel zone radius may be calculated as:

    F1 = 17.3 × √( (d1 × d2) / (f × D) )

where F1 is in metres, d1 and d2 are the distances in kilometres to each end of the path, D = d1 + d2 is the total path length in kilometres, and f is the frequency in gigahertz.

A minimum 60% clearance criterion is also required. This is a clearance over obstacles of 60% (0.6) of the first Fresnel zone calculated above, with k equal to 2/3. With this, we can now use the manufacturer's data for our radios and antennas to determine the final parameters in our planning. These include:

- Antenna gain
- Branching losses at both ends of the link
- Feeder equipment losses

Alternatively…

With all these in place you should be fine to gather your equipment and start right away. However, the process discussed may be too lengthy and prone to errors, depending on your math skills. At the time of writing, technology has made more complicated areas of life, such as dating, as simple as swiping right, so why not this too?
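If you do want to run the manual numbers, the earth-bulge and first-Fresnel-zone calculations described above are easy to script (distances in kilometres, frequency in gigahertz, results in metres):

```python
import math

# Earth bulge and first Fresnel zone radius for a path profile point.
def earth_bulge_m(d1, d2, k=4.0 / 3.0):
    """Earth bulge in metres; d1, d2 in km, k = effective earth radius factor."""
    return (d1 * d2) / (12.75 * k)

def fresnel_radius_m(d1, d2, f_ghz):
    """First Fresnel zone radius in metres; d1, d2 in km, f in GHz."""
    d = d1 + d2
    return 17.3 * math.sqrt((d1 * d2) / (f_ghz * d))

# Midpoint of a 25 km path at 5 GHz:
print(round(earth_bulge_m(12.5, 12.5), 1),
      round(fresnel_radius_m(12.5, 12.5, 5.0), 1))
```

At the midpoint of a 25 km path at 5 GHz, for example, the earth bulge comes to roughly 9 m and the first Fresnel zone radius to roughly 19 m, figures that directly drive the tower heights you plan for.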
Thankfully, many manufacturers offer proprietary simulators to determine a clear PtP or even PtMP (point-to-multipoint) link from the comfort of your computer. One such manufacturer is Ubiquiti, with their popular link simulator. Using Ubiquiti's tool is fairly simple. I did a simulation between two points, Figure 1: the Chiromo area in Nairobi and Limuru, where transmitter sites for broadcast are usually found. To start off, obtain the coordinates of the two points. During site surveys I find My GPS Coordinates very useful: a simple, free Android app that gives you your current coordinates. Having the coordinates of the access point (AP) and station (STA), we can now simulate the link between the two sites. Here I have -1.275106,36.807504 as my access point and -1.127295,36.635714 as my station. In a previous installation, we used Rocket M5 radios operating at 5 GHz with RocketDish antennas with a gain of 34 dBi. Antennas with gains of 30 dBi would also have sufficed, but the higher-gain antennas gave us better signal strengths. This information is filled in on the section in Figure 2, together with the channel width. At this point you leave it to the simulator to perform the calculations that give you details of the signal strength. If a clear path has not been achieved between the two points, the simulator will state that the link is obstructed. This can be rectified by raising the antenna height (to as high as is reasonable) or changing the station area by moving the antenna position to get a clear LoS. If this is still not achievable, consider setting up repeater stations to navigate around the obstructions. With a clear line of sight, the next step is to check whether the signal strengths are satisfactory: the higher the better. According to my simulation, Figure 3, -90dBm is weak while -60dBm is good. Aim for the strongest link possible to counter the effects of fog, precipitation and changes in atmospheric gases.
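The simulator's output can also be cross-checked by hand: take the haversine distance between the two coordinates above, then run a free-space budget. The 27 dBm transmit power is my assumed radio setting for illustration, not a figure from the installation:

```python
import math

# Rough free-space check of the simulated hop above.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    p = math.pi / 180.0
    a = (math.sin((lat2 - lat1) * p / 2) ** 2
         + math.cos(lat1 * p) * math.cos(lat2 * p)
         * math.sin((lon2 - lon1) * p / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

d_km = haversine_km(-1.275106, 36.807504, -1.127295, 36.635714)
# Free-space path loss at ~5.8 GHz (f in MHz), then the link budget
# with an assumed 27 dBm TX power and the two 34 dBi antenna gains:
fspl = 32.44 + 20 * math.log10(d_km) + 20 * math.log10(5800.0)
rx_dbm = 27.0 + 34.0 + 34.0 - fspl
print(round(d_km, 1), round(rx_dbm, 1))
```

Free space predicts a received level far stronger than the -60 dBm "good" figure; the gap between that prediction and what a simulator or site survey reports is the margin eaten by terrain, alignment error and atmospheric losses.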
The same steps can be used for PtMP links.

Conclusion

This is a helpful skill I gathered from setting up microwave links and troubleshooting when failures occur. To ensure the highest levels of availability in a year, the strongest possible links should be set up at your client site. Understanding the important parameters helps you in planning before installation, and the performance […]

Getting It Right With Audio Quality And Consistency
http://symonmk.com/audio-quality-consistency/
Wed, 14 Feb 2018 07:00:09 +0000

Of the twenty-one or so senses we are said to have, hearing plays a large role in how we experience the world we live in. When it comes to video, humans tend to accept the limitations of the current generation of technology. Remember when monochrome TVs were widely adopted? Mom remembers. However, we always expect clean, crisp audio from our entertainment and news platforms; bad-sounding media is often unforgivable compared to poor image quality. Herein lies the need to ensure consistently great audio reaches your listeners. Let's look at how the audio engineers in your plant can contribute to the highest possible audio quality and consistency. Audio media goes through two major processes during its creation: audio production and audio processing. Production involves the activities and equipment used to capture or create sounds. This includes audio design, mixing, editing, dubbing, applying various sound effects and balancing sources. Audio production is beyond the scope of this topic but will be referenced, as it comes before the audio processing chain. The audio processing chain refers to the activities and processes that give your audio a particular desired sound, so that the sound from your production site has a particular mood and feel. This is achieved by technical manipulation of the audio signal. Processing is based on how you want your audio to impact your listening audience, an interesting scientific study called psychoacoustics.

Quality

Audio engineers need to ensure the best signal quality of the audio produced from the production sites (audio labs, recording studios, FM and TV studios etc.), and this process starts with the infrastructure used to capture and transmit the sound.
Isolation and acoustic treatment were covered in a previous post; beyond that we have the transmitting elements: cables and connectors. XLR cables should be well fabricated and the audio cable itself should be of high quality. I've had instances of cables I bought having rusted sleeves! So be wary of those Luthuli Avenue stores and pick the right brand. High-quality connectors should be used for the best-sounding audio; the popular Neutrik connectors should suffice. High-quality microphones should be used for better sound capture. However, great-sounding audio doesn't come cheap. Testing your connections can be quite easy as long as you do not have complicated cable paths. Also avoid running them alongside power cables, to minimize interference. Test for any shorting between the cable elements (hot, cold and sleeve/ground) using a continuity test; resistance along the sleeve should be as low as possible. The rusted sleeve I described earlier had high resistance. Testing for audio in a large plant after all cables are terminated can be a gruelling task, so it is best to test while terminating the connections to ensure you are satisfied at each step. Now check your equipment for any distortion from clipping before the signal reaches the audio processor. In an FM plant, the standard equipment before the processor includes:

- Microphone preamps
- Console summing amplifiers
- Communication devices such as phone systems and remote links
- Analog-to-digital converters
- Stereo profanity delay
- Computer sound cards

Consistency

Audio quality MUST be ensured before processing, as manipulating bad audio to get the desired results is a worthless effort. In the audio processing chain, the sound engineer can perform a set of operations to fine-tune the signal. For best results, use linear uncompressed audio formats such as WAV rather than compressed formats like mp3.
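The clipping check described above is easy to automate once audio is captured to a file. Below is a sketch that counts runs of consecutive full-scale samples in normalised float data; the threshold and minimum run length are arbitrary choices of mine, not a standard:

```python
# Count runs of consecutive samples stuck at (or very near) full scale,
# a telltale sign of clipping, over samples normalised to [-1.0, 1.0].
def clipped_runs(samples, threshold=0.999, min_run=3):
    runs, run = 0, 0
    for s in samples:
        if abs(s) >= threshold:
            run += 1
        else:
            if run >= min_run:
                runs += 1
            run = 0
    if run >= min_run:   # flush a run that ends at the buffer edge
        runs += 1
    return runs

clean = [0.1, -0.5, 0.8, -0.2]
clipped = [0.2, 1.0, 1.0, 1.0, 0.4, -1.0, -1.0, -1.0, -1.0, 0.0]
print(clipped_runs(clean), clipped_runs(clipped))
```

Requiring a minimum run length avoids flagging a single legitimate full-scale peak while still catching the flat-topped waveforms that clipping produces.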
A digital sound processor may be added to your chain to perform the following operations:

- Multiband compression
- Stereo expansion
- Equalization
- Automatic gain control

Multiband compression is a form of dynamic range compression, performed to either amplify low levels or reduce high levels. It is important, say in traffic, that the low levels do not get lost in the background and that the high levels aren't too uncomfortable for listeners; in an automobile, dynamic range cannot exceed about 20 dB without causing problems. A multiband compressor checks the audio being fed to it and applies compression only to the parts of the signal that need compressing. This allows the engineers to increase loudness levels without much fear of distortion. Stereo expansion (or widening) is a technique used to expand your stereo image, the perceived spatial location of the sound sources; stereo expansion thus increases the perceived width of your audio. Panning is the most important technique when it comes to stereo expansion, as it allows you to place instruments or vocals across as wide an area as desired. An extreme version of this method is binaural panning, which emulates human hearing by allowing you to position the direction of a signal source so your ears perceive the sound as coming from in front, behind, above, below, or to the left or right of the listening position when using a stereo output. Get a good set of headphones and enjoy this video. Equalization (EQ) is simply the manipulation of the different frequency components in your signal by use of an equalizer. Note that, for the reasons discussed earlier, dynamic range compression should come before EQ for the best perceived effect; otherwise it will be difficult to establish the effect of the EQ. The equalizer is a circuit or DSP (digital signal processing) plug-in built from linear filters.
EQ is the way to give your audio a particular mood, depending on how you play around with the low, mid and high frequencies. Automatic gain control (AGC) is usually the final step before the output. In electronics, it is a closed-loop circuit that provides a feedback loop allowing for a controlled output despite variable input amplitude; this is used to ensure consistent volume of the audio signal. In Figure 2, the signal to be gain-controlled goes to a diode and capacitor, which produce a peak-following DC voltage.

Again, the equipment following the processing chain in an FM plant can also affect the quality of the audio. It should also be checked for proper […]
