
What is Edge Computing?

Many IT professionals spend most of their careers within the safe and controlled environments of enterprise data centers. However, managing equipment in the field is a different ball-game altogether.

Pandemics such as COVID-19 are transforming the world, and the emerging technology ecosystem is confronting that transformation. Amid this upheaval, edge computing is entering a key transition phase, propelled by the massive shift toward home-based work. Along with generating new opportunities for distributed computing, key players are deploying growing numbers of edge data centers as they navigate the sharp economic downturn.

The major benefit of edge computing is that it acts on data at the source. This distributed computing framework brings enterprise applications closer to the sensors acting as data sources within an IoT system, connecting them with local edge servers and cloud storage systems. Edge computing can deliver strong business benefits through better bandwidth availability, improved response times, and faster insights.
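
As a minimal sketch of the idea, with made-up sensor readings and a stand-in upload function, an edge node might keep the raw stream local and ship only a compact summary to the cloud:

```python
import statistics

def read_sensor_batch():
    # Stand-in for raw readings arriving from local IoT sensors
    return [21.4, 21.6, 35.0, 21.5, 21.3]

def send_to_cloud(summary):
    # Stand-in for an upload to central cloud storage
    print("uploading:", summary)

# Act on data at the source: summarize locally and flag anomalies,
# shipping only the compact result upstream to save bandwidth
readings = read_sensor_batch()
mean = statistics.mean(readings)
anomalies = [r for r in readings if abs(r - mean) > 5.0]
send_to_cloud({"mean": round(mean, 2), "anomalies": anomalies})
```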

Edge computing has the potential to enable new services and technologies through its low-latency wireless connectivity, which could transform global business and society. According to some technologists, edge computing can usher in a new era of powerful mobile devices with virtually no limits on their computing power or the data they can draw on.

Consultants and futurists project the edge economy growing to as much as US$4.1 trillion by 2030. The Linux Foundation, in its Edge 2020 report, claims edge investment will take wing after 2024, with the power footprint of deployed edge IT and data center facilities reaching 102,000 MW by 2028. It expects annual capital expenditures to reach US$146 billion by then.

In the technology world, however, opinions are divided regarding the short-term prospects of edge computing. Although there is no doubt about its usefulness, people are skeptical about the time frame within which edge computing will become profitable. Therefore, starting in 2020, investors and end users have been looking intently at the economics of edge computing, focusing more on its near-term cost-benefits than on its long-term potential.

There is a huge opportunity in edge data centers, as edge computing plays out over several years, with long deployment horizons and gradual adoption of technologies boosting the market. However, executives do not expect the revolution to come cheaply, with the build-out of edge computing straining the economics of digital infrastructure. Over time, repeatable form factors may emerge, leading to more affordable deployments. Experts are confident that most edge data facilities will be highly automated, remotely managed, and require no human intervention.

At present, it is difficult to say which edge projects will succeed. With market segmentation and a fluid ecosystem, even promising ventures can struggle to locate profitable niches. While investors are wary of speculative projects, it is reasonable to expect that well-funded platform builders and stronger incumbents will acquire promising edge players, especially those running short of funding.

Tower operators are also influencing the competitive landscape. Their massive real estate holdings and financial strength position them as potentially important players in the edge computing ecosystem.

What are RTUs – Remote Terminal Units?

Nowadays, remote terminal units (RTUs) and SCADA units are built around small computers. Users program controller algorithms into these units, allowing them to control sensors and actuators. Likewise, they can program algorithms for logic solvers, power factor calculators, flow totalizers, and more, according to actual requirements in the field.

Present-day RTUs are powerful computers able to solve complex algorithms or mathematical formulas describing external functions. Sensing devices, or sensors, gather data from the field and send the signals back to the RTU. By feeding these input signals into its algorithms, the RTU computes and sends out control instructions to valves or other actuators. As scan periods in RTUs are very small, the entire cycle completes very fast, in just a few milliseconds, and the RTU then repeats the process.
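
A minimal Python sketch of that scan cycle, using simulated sensor and actuator stand-ins and made-up setpoint and gain values, might look like this:

```python
import random
import time

SCAN_PERIOD = 0.01   # a 10 ms scan, the few-milliseconds order mentioned above
SETPOINT = 50.0      # hypothetical target for the process variable
GAIN = 0.8           # hypothetical proportional gain

def read_sensor():
    # Stand-in for a real field input; returns a simulated process variable
    return 50.0 + random.uniform(-5.0, 5.0)

def drive_actuator(output):
    # Stand-in for writing to a valve or other actuator
    print(f"actuator output: {output:.1f}%")

for _ in range(5):                        # a real RTU repeats this forever
    start = time.monotonic()
    pv = read_sensor()                    # 1. gather field data
    error = SETPOINT - pv                 # 2. solve the control algorithm
    # a simple proportional algorithm around a 50% bias, clipped to 0..100%
    drive_actuator(max(0.0, min(100.0, GAIN * error + 50.0)))
    # 3. sleep out the rest of the scan period, then repeat
    time.sleep(max(0.0, SCAN_PERIOD - (time.monotonic() - start)))
```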

Regulatory agencies certifying RTUs prefer dedicated hardware for certain safety-related functions, such as toxic gas concentration monitoring and smoke detection, thereby ensuring reliable detection for those functions.

The RTU operates in a closed loop. Sensors measure the process variables, actuators adjust the process parameters, and controllers solve algorithms to drive the actuators in response to the measured variables. The entire system works together over wiring or some form of communication protocol. This way, the RTU enables the field processes near it to operate according to design.

Before the controller in the RTU can solve the algorithm, it has to receive an input from the field sensor. This requires a defined form of communication between the RTU and the various sensors in the field. Likewise, after solving the algorithm, the RTU has to communicate with the different actuators in the field.

In practice, sensors usually feed into a master terminal unit, or MTU, that conditions their input, converting it from analog to digital form where necessary. This is because sensors may be analog or digital types. For instance, a switch acting as a sensor can report its state using a digital one, or +5 V, when it is open and a digital zero, or 0 V, when it is closed. A temperature sensor, however, has to send an analog signal, a continuously varying voltage representing the current temperature.

The MTU uses analog-to-digital converters (ADCs) to convert analog sensor signals to digital form. All communication between the MTU and the RTU is digital, synchronized by a clock signal.
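
As a rough illustration, assuming a hypothetical 12-bit converter with a 5 V reference and a linear sensor spanning 0 to 100 °C, scaling the digital word back to engineering units looks like this:

```python
ADC_MAX = 4095   # a 12-bit converter yields counts from 0 to 4095
V_REF = 5.0      # hypothetical reference voltage

def counts_to_volts(raw):
    # Map the raw count linearly onto the 0..V_REF range
    return raw / ADC_MAX * V_REF

def volts_to_celsius(volts):
    # Hypothetical linear sensor: 0 V reads 0 degC, 5 V reads 100 degC
    return volts / V_REF * 100.0

raw_word = 2048  # a sample digital word received from the MTU
volts = counts_to_volts(raw_word)
print(f"{volts:.2f} V -> {volts_to_celsius(volts):.1f} degC")
```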

The industry uses RTUs as multipurpose devices for remote monitoring and control of various devices and systems, mostly for automation. Although industrial RTUs perform a similar function to programmable logic controllers (PLCs), RTUs operate at a higher level, being essentially self-contained computer units with a processor and memory for storage. Therefore, the industry often uses RTUs as intelligent controllers or master controller units for devices that automate a process, such as part of an assembly line.

By monitoring analog and digital parameters from the field through sensors and connected devices, RTUs can control those devices and send feedback to a central monitoring station, serving industries dealing with power, water, oil, and similar distribution networks.

Raspberry Pi to Linux Desktop

You may have bought a new single board computer (SBC), quite possibly the ubiquitous Raspberry Pi (RBPi). You probably had scores of projects lined up to try on the new RBPi and have enjoyed countless hours of fun and excitement on your SBC. Having exhausted all the listed projects, you are now searching for new ones to try. Instead of letting the RBPi sit idle in a corner, why not turn it into a Linux desktop? At least until another overwhelming project turns up.

An innovative set of accessories converts the RBPi into a fully featured Linux-based desktop computer, with everything housed within an elegant enclosure. The kit, called the Pi Desktop, comes from Premier Farnell, the largest manufacturer of the RBPi. It contains an add-on board with an mSATA interface and an intelligent power controller with a real-time clock (RTC) and battery. A USB adapter, a heat sink, spacers, and screws are also included in the box.

Combining the RBPi with the Pi Desktop gives the user almost all the functionality expected from a standard personal computer. You only have to purchase the solid-state drive and the RBPi camera separately to complete the desktop computer, which has Bluetooth, Wi-Fi, and a power switch.

According to Premier Farnell, the system is highly robust when you use an SSD. Additionally, booting the RBPi directly from the SSD ensures a faster startup.

Although several projects are available that transform the RBPi into a desktop, you should not expect the same level of performance from the RBPi as you would get from a high-end laptop. However, if you are willing to make a few compromises, it is possible to get quite a lot of work done on a desktop powered by the RBPi.

The kit is, in fact, an elegant and simple solution that turns the RBPi into a stylish desktop computer within minutes. Unlike most other kits, the Pi Desktop eliminates a complex bundle of wires and does not compromise on the choice of peripherals. You connect the display directly to the HDMI interface.

The added SSD enhances the capabilities of the RBPi. Apart from extending the storage capacity up to 1 TB, it lets the RBPi boot directly from the SSD instead of the SD card, a pleasant surprise for the user, as startup is much faster. Another feature is the built-in power switch, which lets the user cut power to the RBPi safely through the intelligent power controller without unplugging anything. You can simply turn the power off or on as you would on a laptop or desktop.

The stylish enclosure holds the add-on board containing the mSATA interface and has ample space for the SSD. As the RBPi lacks an RTC, the one included in the kit takes care of the date and time on the display, and its battery backup keeps it running even when power to the kit is turned off. There is also a heat sink to remove heat buildup within the enclosure.

Cloud Storage and Alternatives

Ordinarily, every computer has some local memory storage capacity. Apart from the random access memory (RAM), computers have either a magnetic hard disk drive (HDD) or a solid-state drive (SSD) to store programs and data even when power is shut off, since RAM cannot hold information without power. The disk drive primarily stores the operating system that runs the computer, other application programs, and the data these programs generate. Typically, such storage is limited and tied to a specific computer, meaning other computers cannot share it.

A user has two choices for adding more storage to a computer: either buy a bigger drive or add to the existing one, or use cloud storage. Various service providers offer remote storage, and the user pays a nominal rental amount for a specific amount of cloud memory.

There are several advantages to using such remote storage. Most cloud storage services offer desktop folders where users can drag and drop files from their local storage to the cloud and vice versa. As accessing cloud services requires only an Internet connection, users can reach their cloud storage from anywhere and share it among several computers and users.

The user can also treat the cloud service as a backup holding a second copy of important information. If an emergency strikes and the user loses all or part of the data on their computer, accessing the cloud storage through the Internet restores the information stored there. Cloud storage can therefore act as a disaster recovery mechanism.

Compared to local storage, cloud services are much cheaper, so users can reduce their annual operating costs by using them. Additionally, the user saves on power expenses, as cloud storage does not draw on the power that local storage equipment would need.

However, cloud storage has its disadvantages. Dragging and dropping files to and from cloud storage takes finite time over the Internet, because cloud storage services usually limit the bandwidth available for a specific rental charge. Power interruptions or a bad Internet connection during the transfer process can corrupt data. Moreover, the user cannot access data on cloud storage unless an Internet connection is available.

Storing data remotely also raises concerns of safety and privacy. As the remote storage is likely shared with other organizations, there is a possibility of data commingling.

Therefore, people prefer using private cloud services, which are more expensive, rather than using cheaper public cloud services. Private cloud services may also offer alternative payment plans, and these may be more convenient for users. Usually, the private cloud services have better software for running their services, and offer users greater confidence.

Another option private cloud services often offer is encrypting the stored data. That means only the actual user can make use of their data; others, even if they can access it, will see only garbage.
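
As a minimal sketch of the idea, using the third-party Python cryptography package rather than any particular provider's tooling, client-side encryption means only ciphertext ever leaves the machine:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key stays on the user's machine; the provider sees only ciphertext
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"contents of an important document"
ciphertext = cipher.encrypt(plaintext)          # this is what gets uploaded
assert cipher.decrypt(ciphertext) == plaintext  # readable only with the key
```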

What is a wireless router?

Most of the electronic gadgets we use today are wireless. When they have to connect to the Internet, they do so through a device called a router, which may be a wired or a wireless one. Although wired routers were very common a few years back, wireless routers have overtaken them.

Routers, as their name suggests, direct a stream of data from one point to another, or to multiple points. Usually, the source of data is the transmitting tower belonging to the broadband provider. The connection from the tower to the router may be through a cable, a wire, or a wireless link. To redirect the traffic, the router may have a bank of Ethernet ports to which users connect their PCs or, as in the latest versions, it may transmit the data wirelessly. The only wire a truly wireless router will probably have is a cable to charge its internal battery.

Technically speaking, a wireless router is actually a two-way radio, receiving signals from the tower and retransmitting them for other devices to receive. A SIM card inside the router identifies the device to the broadband company, helping it keep track of the router's statistics. Modern wireless routers follow international wireless communication standards, 802.11n being the latest, although there are several of the type 802.11b/g/n, meaning they conform to the earlier standards as well. Routers also differ in their operating speed and the band on which they operate.

The international wireless communication standards define the speed at which routers operate. For instance, wireless routers of the 802.11b type are the slowest, with speeds reaching up to 11 Mbps. Those with the g suffix can deliver a maximum of 54 Mbps, while those based on the 802.11n standard are the fastest, reaching up to 300 Mbps. However, a router can deliver data only as fast as the Internet connection allows. Therefore, even a router rated at n, or 300 Mbps, will perform at no more than, say, 100 Mbps if that is what the broadband line provides. Nonetheless, a fast wireless router can speed up your local network, allowing PCs to interact faster and making them more productive.
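
The arithmetic is simply that the slower link sets the pace; a few lines of Python make the point, assuming a hypothetical 100 Mbps broadband line:

```python
def effective_mbps(radio_mbps, internet_mbps):
    # The user sees whichever link is slower
    return min(radio_mbps, internet_mbps)

BROADBAND = 100  # hypothetical Internet connection speed in Mbps
for standard, radio in [("802.11b", 11), ("802.11g", 54), ("802.11n", 300)]:
    print(f"{standard}: {effective_mbps(radio, BROADBAND)} Mbps effective")
```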

International standards allow wireless communication on two bands: 2.4 GHz and 5.0 GHz. Most wireless routers based on the 802.11b, g, and n standards use the 2.4 GHz band; these are single-band routers. However, the 802.11n standard also allows wireless devices to operate on the 5.0 GHz band. Dual-band routers can transmit on either of the two bands via a selection switch or, in some devices, operate on both frequencies at the same time.

Another standard, 802.11a, enables wireless networking on the 5.0 GHz band. Routers that combine it with radios for the 2.4 GHz band used by the 802.11b, g, and n standards are likewise dual-band wireless routers, with two different types of radios supporting connections on both the 2.4 GHz and 5.0 GHz bands. The 5.0 GHz band offers better performance and lower interference, although its range is typically shorter.

What happens when you turn a computer on?

Working on a computer is so easy nowadays that even children handle them expertly. However, several things happen when we turn on the power to a computer, before it can present the friendly graphical user interface (GUI) screen we call the desktop. In a UNIX-like operating system, the computer goes through booting, the BIOS, the master boot record, the bootstrap loader, GRUB, and init before reaching its operating level.

Booting

As soon as you switch on the computer, the motherboard initializes its own firmware to get the CPU running. Some registers, such as the instruction pointer of the CPU, have fixed reset values that point to a location in read-only memory (ROM) containing the basic input/output system (BIOS) program. The CPU begins executing the BIOS from the ROM.

BIOS

The BIOS program has several important functions, beginning with the power-on self-test (POST), which ensures all the components present in the system are functioning properly. POST indicates any malfunction in the form of audible beeps, and you have to refer to the motherboard's beep codes to decipher them. If the computer passes the test for the video card, it displays the manufacturer's logo on its screen.

After these checks, the BIOS initializes the various hardware devices so they can operate without conflicts. Most BIOSes follow the ACPI specification, creating tables that describe the devices in the computer.

In the next stage, the BIOS looks for an operating system to load. The search sequence follows an order predefined by the manufacturer in the BIOS settings; however, the user can change this boot order to alter the search. In general, the search starts with the hard disk, then CD-ROMs and thumb drives. If the BIOS does not find a suitable operating system, it displays an error. Otherwise, it reads the master boot record (MBR) to learn where the operating system is located.

Master Boot Record

In most cases, the operating system resides on the hard disk. The first sector of the hard disk is the master boot record (MBR), and its structure is independent of the operating system. It consists of a special program, the bootstrap loader, and a partition table. The partition table is a list of all the partitions on the hard disk and their file system types, while the bootstrap loader contains the code to start loading the operating system. Complex operating systems such as Linux use the grand unified bootloader (GRUB), which allows selecting one of several operating systems present on the hard disk. Booting an operating system using GRUB is a two-stage process.
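
For illustration, here is a minimal Python sketch that pulls the partition table out of a 512-byte MBR image; the helper name is hypothetical, and real boot code handles many more details:

```python
import struct

def parse_mbr(raw):
    # A 512-byte MBR: 446 bytes of bootstrap code, four 16-byte
    # partition entries, and the 0x55AA boot signature at the end
    assert len(raw) == 512
    if raw[510:512] != b"\x55\xaa":
        raise ValueError("missing 0x55AA boot signature")
    partitions = []
    for i in range(4):
        entry = raw[446 + 16 * i : 462 + 16 * i]
        bootable = bool(entry[0] & 0x80)      # active/boot flag
        ptype = entry[4]                      # file system type code
        lba_start, num_sectors = struct.unpack("<II", entry[8:16])
        if ptype:                             # type 0x00 marks an empty slot
            partitions.append((bootable, hex(ptype), lba_start, num_sectors))
    return partitions

# e.g., with open("/dev/sda", "rb") as disk: print(parse_mbr(disk.read(512)))
```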

GRUB

Stage one of GRUB is a tiny program whose only task is to call stage two, which contains the main code for loading the Linux kernel and the file system into RAM. The kernel is the core component of the operating system; it remains in RAM throughout the session and controls all aspects of the system through its drivers and modules. The last step of the kernel boot sequence is init, which determines the initial run-level of the system. Unless otherwise instructed, it brings the computer to the graphical user interface (GUI) for the user to interact with.

Connect with a New Type of Li-Fi

Many of us are stuck with slow Wi-Fi and are eagerly waiting for light-based communications to be commercialized, as Li-Fi promises to be more than 100 times faster than the Wi-Fi connections we use today.

As advertised so far, most Li-Fi systems depend on an LED bulb to transmit data using visible light. However, this limits how the technology can be applied outside the lab. Researchers are therefore now pursuing a different type of Li-Fi that uses infrared light instead. In early testing, this new technology has already crossed speeds of 40 gigabits per second.

In Li-Fi, a communication technology first demonstrated in 2011, data is transmitted via high-speed flickering of an LED light, fast enough to be imperceptible to the human eye. Although lab-based Li-Fi speeds have reached 224 Gbps, real-world testing has reached only about 1 Gbps. As this is still higher than the Wi-Fi speeds achievable today, people were excited about getting Li-Fi in their homes and offices; after all, you need only an LED bulb. However, there are certain limitations to this scheme.
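
At its simplest, such flickering is on-off keying, where each data bit maps to light on or light off; a toy Python sketch of that mapping (real Li-Fi modulation is far more sophisticated) looks like this:

```python
def to_ook_symbols(payload: bytes):
    # Emit one light state per bit, most significant bit first:
    # 1 means LED on, 0 means LED off for one symbol period
    for byte in payload:
        for bit in range(7, -1, -1):
            yield (byte >> bit) & 1

# At millions of symbols per second, the eye sees a steadily lit bulb
print(list(to_ook_symbols(b"Hi")))
```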

LED-based Li-Fi requires the bulb to be turned on for the technology to work; it will not work in the dark, so you cannot browse while in bed with the lights off. Moreover, as with regular Wi-Fi, a single LED bulb distributes the signal to all connected devices, so the system slows down as more devices connect to the bulb.

Joanne Oh, a PhD student from the Eindhoven University of Technology in the Netherlands, wants to fix these issues with the Li-Fi concept. The researcher proposes to use infrared light instead of the visible light from an LED bulb.

Using infrared light for communication is not new, but it has not been popular or widely commercialized because of the energy-intensive movable mirrors needed to beam the infrared light. Oh, on the other hand, proposes a simple passive antenna with no moving parts to send and receive data.

Rob Lefebvre, from Engadget, explains that the new concept requires very little power, since there are no moving parts. According to him, the new concept may be far more than marginally speedier than current Wi-Fi setups, while providing the interference-free connections envisaged.

For instance, experiments with the system at the Eindhoven University of Technology have already reached download speeds of over 42 Gbps over distances of 2.5 meters. Compare this with the average connection speed most people see from their Wi-Fi, approximately 17.5 Mbps, and the maximum the best Wi-Fi systems can deliver, around 300 Mbps. These figures are roughly 2,400 times and 140 times slower, respectively.

The new Li-Fi system feeds rays of infrared light through an optical fiber to several light antennae mounted on the ceiling, which beam the wireless data downwards through gratings. The gratings radiate the light rays in different directions depending on their wavelengths and angles. Being entirely passive, the antennae need no power or maintenance.

As each device connecting to the system gets its own ray of light to transfer data at a slightly different wavelength, the connection does not slow down, no matter how many computers or smartphones are connected to it simultaneously.

Python Libraries for Machine Learning

Machine learning, suitably augmented by deep learning and other extensions of the overall field of artificial intelligence, helps with many practical applications. Many people, with the help of analytics and statistics, are busy navigating the vast universe of machine or deep learning, artificial intelligence, and big data. However, they do not really have to qualify as data scientists, because popular machine learning libraries in Python are available.

Machine learning, along with deep learning and AI, is powering all kinds of machine assists, including driverless cars, better preventive healthcare, and even better movie recommendations.

Theano

A machine-learning group at the Université de Montréal developed and released Theano a decade ago. In the machine learning community, Theano is one of the most used mathematical compilers for CPUs and GPUs. A 2016 paper, which describes Theano as a "Python framework for fast computation of mathematical expressions," offers a thorough overview of the library.

According to the paper, several software packages have been developed that build on the strengths of Theano, offering higher-level user interfaces better suited to specific goals. For instance, with the development of Keras and Lasagne, it became easier to express training algorithms mathematically and to evaluate deep learning model architectures using Theano.

Likewise, PyMC3, a probabilistic programming framework built on Theano, automatically derives expressions for gradients and generates C code for fast execution. That people have forked Theano over two thousand times, that it has almost 300 contributors on GitHub, and that it has garnered more than 25,000 commits is testimony to its popularity.
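
As a minimal taste of the library, assuming Theano is installed, the sketch below builds a symbolic expression, derives its gradient automatically (the same machinery PyMC3 leans on), and compiles it to fast native code:

```python
import theano
import theano.tensor as T

# Declare a symbolic scalar and build an expression graph
x = T.dscalar('x')
y = x ** 2

# Theano derives the gradient expression symbolically
gy = T.grad(y, x)

# Compile the graph into an executable function and evaluate it
f = theano.function([x], gy)
print(f(4.0))  # 8.0, the derivative of x**2 at x = 4
```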

TensorFlow

Although a newcomer to the world of open source, TensorFlow is a library for numerical computing that uses data flow graphs. In its first year alone, TensorFlow helped students, artists, engineers, researchers, and many others. According to the Google Developers Blog, TensorFlow has helped with preventing blindness in diabetics, early detection of skin cancer, language translation, and more.
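
A minimal sketch, assuming the graph-and-Session API of the TensorFlow 1.x releases current at the time, shows what a data flow graph means in practice: operations are nodes, tensors flow along the edges, and nothing runs until a Session executes the graph.

```python
import tensorflow as tf

# Build a small data flow graph; no computation happens yet
a = tf.constant(2.0, name="a")
b = tf.constant(3.0, name="b")
c = tf.add(tf.multiply(a, b), b, name="c")  # c = a * b + b

# A Session executes the graph and materializes the result
with tf.Session() as sess:
    print(sess.run(c))  # 9.0
```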

TensorFlow has appeared several times in the most recent Open Source Yearbook, including in the list of top ten open source projects to watch in 2017. In a tour of Google's 2016 open source releases, an article by Josh Simmons refers to Magenta, a TensorFlow-based project.

According to Simmons, Magenta advances the state of machine intelligence for music and art generation, and helps build a collaborative community of coders, artists, and researchers working with machine learning. Another researcher, Rachel Roumeliotis, lists TensorFlow among the technologies powering AI in her roundup of hot programming trends of 2016.

Anyone can learn more about TensorFlow by watching the recorded live stream of the TensorFlow Dev Summit 2017 or by reading the DZone series, TensorFlow on the Edge.

Scikit-Learn

Engineers at Spotify use Scikit-Learn for recommending music, OkCupid uses it to help evaluate and improve its matchmaking system, and Birchbox uses it to explore phases of new product development. Scikit-Learn is built on Matplotlib, SciPy, and NumPy. It has 800 contributors on GitHub and has garnered almost 22,000 commits.

The Scikit-Learn project website offers free tutorials on using Scikit-Learn for machine learning. Alternatively, one can watch the PyData Chicago 2016 talk given by Sebastian Raschka.
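
For a minimal taste, assuming Scikit-Learn and its dependencies are installed, the library's fit-and-predict pattern looks like this on a bundled toy dataset:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Split a bundled toy dataset into training and held-out test sets
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit a classifier and score it on the unseen test data
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```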

Raspberry Pi to Displace the Business PC

For a business establishment, maintaining PCs for each of several hundred employees can be an expensive proposition. It is much simpler and cheaper to have a centralized workstation with several thin clients connecting to it. The ubiquitous single board computer, the Raspberry Pi (RBPi), is a suitable component for use as such a thin client.

As the low cost of the Raspberry Pi makes it a very attractive proposition for use as a thin client computer, Citrix is offering the HDX Ready Pi as a replacement for the regular desktop PC, coupling the RBPi with the Citrix XenDesktop virtual desktop and XenApp virtual apps. The combination is an ideal replacement for the traditional desktop PC and its IT refresh cycle.

At the heart of the project is a thin-client operating system, ThinLinX's TLXOS, based on Raspbian, the default OS for the Raspberry Pi. It provides the image for the RBPi and includes the client and management software. Citrix uses this to install an HDX SoC Receiver SDK within the securely locked-down Linux OS, and the SDK provides full device management for updating firmware, remote configuration, and DHCP, making the RBPi a completely plug-and-play device.

Available fully assembled and ready to order from Citrix partners ViewSonic and Micro Center, the HDX Ready Pi thin clients come preloaded with all the necessary software and include a power supply, flash storage, and a VESA mount option, all packaged in a production case. Any IT administrator can deploy these thin clients in a matter of seconds.

Apart from being just a cheap PC alternative, these RBPi thin clients offer businesses several new possibilities. For instance, businesses need no longer pay a premium for security and management of all their PCs, and they can expand the number of users to cover the entire organization.

The Citrix HDX Ready Pi is easy to set up. As it is small, distribution is simplified, and employees can connect it to an available display and be productive in a matter of minutes. IT can configure the management software to recognize the HDX Ready Pi on the network, take control of it, and point it automatically to the correct Citrix StoreFront server. The user can then instantly run any virtual app with desktop access.

As the RBPi thin clients have no hard disks to fail, no time is wasted diagnosing device problems. This eliminates all desk-side support, as any issue can be solved simply by swapping the device.

The low cost of thin clients also eliminates the need to treat them as trackable financial assets. Businesses can instead consider the Citrix HDX Ready Pi a non-capitalized office expense, which makes a compelling case for virtualizing remote branch offices all over the world.

As there is no provision to store or cache corporate data, businesses can safely distribute the HDX Ready Pi among employees for occasionally working from home over Wi-Fi or for teleworking. Employees can take the device home and use it safely for remote access.

Although the Citrix HDX Ready Pi has a Kensington lock slot, its low cost makes physical security almost a non-issue. Moreover, as the device is purpose-built for Citrix, it can be safely used as a pervasive computing device in an office campus or in public spaces.

RX300 – The Windows 10 Thin Client with the Raspberry Pi

The Raspberry Pi (RBPi) has no hard disk, is stateless, and can work as a desktop terminal, which makes it an ideal candidate for use as a thin client. It connects to the data center for all its applications, sensitive data, and storage, and runs a remote desktop protocol such as Windows Terminal Services.

That makes the RBPi fit the virtual desktop computing model, as it runs virtualization client software and accesses hard drives in the data center. Thin client computing has thin clients, software services, and backend hardware as its components.

Users can deploy thin clients as replacements for PCs to access any virtual desktop or virtualized application, a cost-effective way to create a virtual desktop infrastructure. NComputing is using the RBPi as a thin client, named the RX300, to access the Windows 10 desktop.

A central machine runs the NComputing vSpace Pro 10 desktop virtualization software, and streams several Windows desktops, including Windows 10. The virtualization software allows the centrally managed Windows desktop to be run on hundreds of RX300 clients.

According to NComputing, the vCAST streaming technology it uses for full-screen playback can handle full HD local or web video on the RX300s, without requiring a dedicated GPU on the central server. Buying the RX300 automatically activates a free subscription to the vSpace Pro 10 technology, but only for twelve months.

Each RX300 is an RBPi 3 model B with four USB 2 ports. They have full USB redirection and server-side device drivers that offer support for a complete range of peripherals. While running the official Linux-based Raspbian Operating System, each RBPi RX300 runs as a thin client and accesses a virtual Windows 10 desktop.

According to NComputing, the RX300 thin clients are simple to configure and receive updates from the vSpace Pro 10 servers. Young Song, the CEO of NComputing, says the company selected the RBPi 3 as the base for its thin clients because the board is affordable and portable.

From its vSpace Pro 10, NComputing streams a Windows desktop to a single client. For streaming desktops to several clients simultaneously, vSpace Pro 10 must run on Windows Server 2016 or similar, so the user will also need to purchase the appropriate Microsoft licenses for client access.

The price per seat of a thin client deployment has now dropped, making thin clients more cost-effective than regular PCs, and using RBPis as thin clients makes that claim a definite reality.

Several industries and enterprises are now switching over to thin clients. They may have different requirements, but all share a few common goals. IT personnel exploring such goals are unequivocal about the benefits of thin clients: cost, security, manageability, and scalability.

The term thin client derives from small computers in networks acting as clients rather than servers. The goal is to limit thin clients to only essential applications, keeping them centrally managed and less vulnerable to malware attacks. They also have a longer life cycle, use less power, and are less expensive to purchase.