An introduction to edge computing. What is it?

Edge computing refers to the process of allocating data and getting it to the right place at the right time. This is done by locating computing and storage resources at the edge of a network and moving data between them. Edge computing decentralises data processing and avoids non-essential data transmissions.

There are many definitions of edge computing; however, the one shown below is the one the Technology Exchange believes best fits the services that we provide:

“Cloud-like capabilities located at the infrastructure edge, including from the user perspective access to elastically-allocated compute, data storage and network resources. Often operated as a seamless extension of a centralized public or private cloud, constructed from micro-data centres deployed at the infrastructure edge”.

A typical network topology for an edge computing implementation involves three major layers:

Edge computing comes with many advantages:

Why forward-thinking telecommunications providers are cloudifying their networks

Like businesses in every sector, many telecom service providers have already moved their IT infrastructure delivery and maintenance to the cloud. Cloudification is an increasingly essential way for any company to cut IT operations costs while gaining the flexibility and agility needed to survive and thrive in today’s fast-changing world.

But how do telecommunication services integrate with cloud services? This is an interesting question, as just a few years ago the conventional thought was that telecoms networks were strictly the domain of specialized, purpose-built appliances.

The emergence of network function virtualization (NFV) first opened the door to the cloud as a core part of tomorrow's telecoms network. The first milestone of this journey was the virtualization of network functions. This process consists of replacing traditional network appliances with highly efficient virtualized functions delivered using industry-standard IT equipment. Through virtualization, telcos can use standardized server hardware for multiple purposes, instead of relying on bespoke appliances.

In more recent years, two trends have advanced the cloudification of telecommunication networks. First, in addition to pure virtualization, telcos have started to employ cloud business and cloud management practices, such as infrastructure as a service (IaaS), for their network functions. This cloudification is a game changer for next-generation telecommunication networks.

Why Passwords are Problematic

According to Microsoft's Professor Woodward, passwords are a decades-old concept, and Computer Weekly states that they are a ubiquitous part of the digital age. As such, it is imperative to look for something different and more secure to replace them.

Passwords pose one of the biggest challenges to cyber security today, largely because they are often easy to guess.

Here are some of the most common password problems: 

How to Deal With the Password Problem 

An average online user may have about 50 online profiles, requiring more than 50 passwords. For a password to be effective, it needs to be an uncommon phrase of more than eight characters.

To increase the effectiveness of a password, Microsoft states that: 

With all these requirements, it becomes impossible to memorise 50 complex and unique passwords. 
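As an illustration, the two rules above (longer than eight characters, not a common phrase) can be sketched as a simple check. The blocked-phrase list here is purely illustrative; a real check would consult a breached-password database:

```python
# Illustrative stand-in for a real list of breached/common passwords
COMMON_PHRASES = {"password", "123456789", "qwertyuiop", "letmein"}

def meets_guidelines(password: str) -> bool:
    """Check the rules above: more than eight characters,
    and not a commonly used phrase."""
    return len(password) > 8 and password.lower() not in COMMON_PHRASES

print(meets_guidelines("password"))               # False: common, and only 8 characters
print(meets_guidelines("orbit-walrus-teacup-9"))  # True: long and uncommon
```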

Microsoft’s Solution 

Since March, the tech giant has rolled out passwordless accounts across Microsoft and Windows products. The passwordless login process prompts users signing in to a Microsoft account to provide their fingerprint (or another secure unlocking feature) on their phone.

Biometrics and special security keys provide a unique and secure alternative to passwords. Due to identity theft and compromised passwords, 67% of banks have invested heavily in facial recognition, fingerprints, and voice recognition, according to a 2019 Global Banking Survey conducted by KPMG.

According to the communique from Microsoft, only the owner of a phone can give fingerprint authentication when prompted to do so. This is more secure than the use of regular passwords.  

So, if you lose your phone or forget your details after upgrading, there are backup options, including:

Two-factor authentication will also mean you need two different recovery methods if you lose your phone.  

Companies are starting to rely on multi-factor authentication (MFA) to boost the security of accounts. This seeks to identify people in as many different ways as possible, combining measures such as PINs, fingerprint scans, your location, swiping patterns and phone identity to help confirm who you are.
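One widely deployed second factor is the time-based one-time password (TOTP): the rotating six-digit code produced by authenticator apps. A minimal sketch of the standard algorithm (RFC 6238, HMAC-SHA1), using only Python's standard library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238)."""
    # Counter = number of 30-second intervals since the Unix epoch
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10 ** digits:0{digits}d}"

# RFC 6238 test vector: this shared secret at t=59s yields code 287082
print(totp(b"12345678901234567890", for_time=59))  # 287082
```

The server and the phone share the secret, and both derive the same short-lived code from the current time, so possession of the phone becomes the second factor.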

Conclusion 

According to Ali Niknam, the CEO of Bunq, biometrics will not replace passwords outright. However, combining different factors through multi-factor authentication will be critical in enhancing information security.

What Does the Future Hold for High Throughput Satellites (HTS)?

Satellite technology has evolved constantly in the 60 years since its inception, from primarily government-funded projects to, more recently, commercial ventures that provide data infrastructure to some of the most remote areas of the world.

Along with the increased competition from space launch providers in recent years, one of the most talked-about developments is high throughput satellites.

What are High Throughput Satellites (HTS)?

There are a few key differences between high throughput satellites compared to their conventional counterparts:

Firstly, HTS offer increased capacity through high-level frequency re-use and spot beam technology, enabling multiple narrowly focused beams. While conventional FSS (fixed-satellite service), MSS (mobile-satellite service) and BSS (broadcasting-satellite service) satellites use wide beams to provide coverage over a large area (thousands of kilometres), HTS use spot beams that focus on a narrower area (hundreds of kilometres). This reduces the range of a beam from covering entire continents or countries to towns and regions; HTS instead use multiple spot beams to cover a large area.

Secondly, because spot beams are so narrow, HTS can concentrate power to produce a throughput of up to 20+ times that of a conventional FSS. This reduces the cost per bit significantly: where capacity on a conventional Ku-band FSS satellite could cost $100 million per gigabit per second, ViaSat-1 could supply similar capacity for $3 million.

Compared to a conventional Ku-band FSS satellite, HTS are capable of carrying over 100 Gbit/s – 100 times more than the conventional satellite. ViaSat-1, launched in 2011, could carry up to 140 Gbit/s, a higher capacity than all the FSS satellites serving North America combined.
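The cost-per-bit gap above can be made concrete with a little arithmetic. These are the article's rough estimates, not quoted prices:

```python
# Rough figures from the text (illustrative estimates, not quoted prices)
conventional_cost_per_gbps = 100e6  # ~$100M per Gbit/s, conventional Ku-band FSS
hts_cost_per_gbps = 3e6             # ~$3M per Gbit/s, ViaSat-1

ratio = conventional_cost_per_gbps / hts_cost_per_gbps
print(f"HTS capacity is roughly {ratio:.0f}x cheaper per bit")
```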

The Future of HTS?

With many new launches happening every year, the number of high throughput satellites in orbit is only going to increase. Launch prices are dropping, and they are the same for both HTS and conventional satellites. With nearly 20-100 times the capacity for only a 50% increase in manufacturing cost, it is a no-brainer which type satellite manufacturers will choose to create.

Total investments from the 25 operators, who have combined to order nearly 100 HTS systems to date, are estimated to be over $17 billion. Another 123 HTS systems are expected to be launched over the next decade; about 78 have yet to be officially contracted and are still open to the market. Given this level of investment activity, global HTS capacity supply is set to more than triple from 680 Gbps in 2015 to 3 Tbps by 2020.

With capacity no longer a stranglehold on what businesses can use satellite technology for, we will likely see the emergence of new industries that previously could not afford the high prices. It is certain to increase the uptake of VSAT by both businesses and consumers, along with increased backhauling for cellular providers in rural areas, to name a few examples. Recently, ViaSat signed a new contract with Qantas Airways to provide passengers with in-flight connectivity using a hybrid Ku/Ka-band antenna. It is likely that deals like these will only increase in the future and expand to other ventures.

In terms of market value, HTS capacity lease revenues are forecast to jump from $1.1 billion in 2015 to $4.9 billion by 2024, generating over $26 billion in aggregate revenues over the period.

What are the Cons?

While this increase in capacity can reduce cost per bit and improve efficiency, high throughput satellites are still hindered by the common problems of satellite technology.

With high throughput satellites, you need multiple spot beams to cover an entire country, and hub infrastructure serving each spot beam. To be able to access those spot beams, you need to reach the different gateways through terrestrial means, meaning terrestrial (fibre) links between each of these hubs along with a central network operating centre (NOC).

Ka and Ku-band suffer more from rain-fade compared to traditional C-band beams – a common complaint among satellite TV subscribers. Despite many solutions addressing this issue (such as site diversity and adaptive coding and modulation), there may still be hesitation from consumers.

Alongside this, latency will still be an issue for high throughput satellite users. While it is an improvement compared to conventional satellites, the latency is higher than that of terrestrial voice and broadband services. This has a particular effect on user experience, especially for interactive internet content – such as online gaming or OTA desktop access. O3b is addressing this issue with a MEO (medium Earth orbit) satellite constellation to reduce latency.

While HTS has a lot of promise and could spell the demise of the current FSS, this may not be the case (yet). A lot of mission-critical programmes still use FSS and C-band satellites due to their large footprint and their reliability in managing a network end to end. Despite the higher costs, this continued reliability could save a company from huge losses if disruption occurred.

The increase in capacity does bring one problem: capacity can only be sold cheaply if you can sell lots of it. While capacity has increased, the challenge for satellite operators is now finding the users to fill that volume. The next task for satellite operators and communication providers is marketing and filling this extra capacity to ensure the promised fall in cost per bit.

How Cache Servers Can Save You Bandwidth & Money

It will be decades before terrestrial networks are able to provide adequate and equal bandwidth across the globe. For now, pockets of highly connected regions, such as Europe and the United States, will dominate global telecoms.

Until then, geostationary satellites will provide communication to rural regions with little or no terrestrial infrastructure. However, what do you do if you need to deliver large files to a large number of users in such a region without incurring huge costs? A cache server is the answer.

What is a Cache?

A cache is where your computer stores regularly accessed files for a limited time, until a certain requirement is met. A computer has many caches for different purposes; an example is a web browser storing frequently accessed web pages. By storing a file close to the end-user, the access time becomes negligible.

How Does Caching Work on A Local Level?

Compared to the data transfer speeds of memory and the motherboard, your internet connection is the slowest link, often by a factor of 10 to 100.

On a local level, your browser keeps a record of the websites you have visited along with their downloaded content. This is known as a cache.

When you visit a site for the first time, your browser downloads the website data and content to the cache along with a time-date stamp. When you revisit the site, the browser checks the time-date stamp in the cache and compares it with the time-date stamp on the website. If the cached stamp is at least as new as the host's, the browser loads the website data from the cache. If the host's time-date stamp is newer than the cached one, the browser downloads the website data from the host and replaces the cached version.
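The revalidation logic above can be sketched as follows. This is a simplified model rather than a real browser implementation: `cache` is a plain dict, and `head_last_modified` / `download` are stand-ins for the actual HTTP requests to the host.

```python
def get_page(url, cache, head_last_modified, download):
    """Serve a page from the cache while its time-date stamp is still current.

    cache: dict mapping url -> (content, last_modified)
    head_last_modified(url): asks the host for its current time-date stamp
    download(url): fetches (content, last_modified) from the host
    """
    cached = cache.get(url)
    host_stamp = head_last_modified(url)
    if cached is not None and cached[1] >= host_stamp:
        return cached[0]                 # cached copy is as new as the host's
    content, stamp = download(url)       # host copy is newer: re-download
    cache[url] = (content, stamp)        # replace the cached version
    return content
```

Real browsers express the same idea with HTTP headers such as Last-Modified / If-Modified-Since and Cache-Control rather than a raw timestamp comparison.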

This is important for all users and for the network provider, as it reduces bandwidth usage, server load and perceived delay. Without caching, the internet would be costly and slow, and wouldn't be as commonplace as it is today.

How Does Caching Work for a Network Provider?

Before Web 2.0 and content-driven websites, website data would most likely be downloaded straight from the host server. This was not too much of a problem with static websites, due to their vastly smaller size. But with content-based websites, this can cause problems.

As the internet has grown, it has become a large hierarchy of many cache servers to help ease the network load caused by content-based websites. Cache servers are usually operated by network providers and can operate on a national, regional and citywide basis, storing many of the most popular websites and files for their users.

For example, a user looks up a news site. The computer first looks in its local cache for a copy of the website whose time-date stamp has not expired. If there is none, it goes to the nearest cache server to the user's location, found by looking at their IP address, and checks whether that server's copy has expired. If it has, it looks to the next nearest cache server, and continues in this way until a fresh copy is found or, after a number of failed attempts, the request is redirected to the host server. Once a copy is found, it is checked to make sure the content is still fresh; if not, a copy is downloaded from the host server. The copy is then written back through all the caches it passed until it reaches the original user.
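The lookup chain above can be sketched as a loop over cache servers ordered from nearest to farthest, falling back to the host (origin) server. Again a simplified model: each cache is a dict of `url -> (content, expires_at)`, and `origin_fetch` stands in for the real request to the host.

```python
def resolve(url, cache_chain, origin_fetch, now):
    """Walk cache servers nearest-first; fall back to the origin server.

    On a hit, the copy is written back into every cache that missed on
    the way, so the next nearby request is served locally.
    """
    missed = []
    content = expires_at = None
    for cache in cache_chain:                     # nearest cache first
        entry = cache.get(url)
        if entry is not None and entry[1] > now:  # fresh copy found
            content, expires_at = entry
            break
        missed.append(cache)
    if content is None:                           # every cache missed or expired
        content, expires_at = origin_fetch(url)
    for cache in missed:                          # fill caches on the way back
        cache[url] = (content, expires_at)
    return content
```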

The same method applies to large content items, such as videos. When watching Netflix programmes, you are most likely watching a copy served from a local cache rather than from Netflix's host servers.

Why Is Caching Important to VSAT Users?

In areas not served by terrestrial internet, installing a cache server is not only important, it is necessary. Without one, it would be too expensive to provide internet to that area.

Despite the price per Mb per month falling, satellite internet currently is the most expensive form of internet connection. However, in places that lack terrestrial internet, it is usually the only option for getting online. This is set to change in the coming years.

For companies and network providers, installing content cache servers and creating a content distribution network (CDN) can be highly beneficial for VSAT users. By placing cache servers at strategic locations, you can distribute large files without overrunning your bandwidth, because you only send one copy of each file to the cache server. On a satellite connection, this can save a lot of money and bandwidth, especially if the connection is a gateway for many users.

When a user requests to download the file, the website redirects the user, based on their IP address location, to the nearest cache server. This not only saves bandwidth and money but results in a better experience for the user. Prices for cache servers have dropped dramatically over the past five years, primarily due to the fall in hardware and memory prices, making ROI much more achievable.

An Example of a Single Site Connected via Satellite with Many Users

A company produces an operating system that is used by millions of people around the world. Every month, the company has to issue an update to its users to ensure their operating system remains secure and efficient.

The updates are not small, at approximately 1000Mb in size. Over terrestrial connections, this is no problem: network providers recognise that the update will likely be downloaded by many users within a few days, so they store it in their caches.

But for VSAT users, it is trickier. It costs more money to transfer 1000Mb over a satellite internet connection than over terrestrial internet. If your VSAT connection is the gateway for many users, downloading this tens, hundreds or thousands of times could be a costly endeavour.

So for companies and NGOs serving VSAT users, installing a cache server at the satellite receiver means that only one copy of the update has to travel over the satellite connection. The cache server then stores it and serves it to anyone at the local level wishing to download the update.
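The saving is easy to quantify. A rough sketch, with a hypothetical per-megabit satellite tariff (the $0.25/Mb figure and the user count are illustrative only):

```python
def satellite_transfer_cost(file_mb, users, cost_per_mb, cached=False):
    """Cost of delivering one file to every user over the satellite hop.

    With a cache at the receiver the file crosses the satellite link once;
    without one, every user's download crosses it.
    """
    crossings = 1 if cached else users
    return file_mb * crossings * cost_per_mb

# Hypothetical: 1000Mb update, 200 users behind one VSAT gateway, $0.25/Mb
print(satellite_transfer_cost(1000, 200, 0.25))               # 50000.0
print(satellite_transfer_cost(1000, 200, 0.25, cached=True))  # 250.0
```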

This method reduces the cost of downloading the update by a large factor.

How Will Increased Competition within the Space Launch Industry Affect VSAT Providers?

Within the past 5 years, there has been a major change in providers of commercial space launches. Prior to 2012, nearly all space launches, whether for commercial or governmental purposes, used a vehicle that had previously been part of a government-funded project. But now, space launch providers such as SpaceX, ULA and Arianespace are providing their own launch vehicles, creating a cost war that is bringing prices down.

With satellites requiring a costly launch price before entering service, will the increased competition from launch providers mean better deals for VSAT/communication providers and their customers too?

What Has Happened in the Past 5 Years?

Prior to 2012, most launches were government-funded or carried out by contractors on behalf of the government. The US market was dominated by ULA (United Launch Alliance), which served NASA, the US Department of Defense and others. Elsewhere, the market was dominated by Europe's Arianespace.

However, in 2008 SpaceX became the first provider to launch a privately funded liquid-fuel rocket to orbit; it docked with the ISS in 2012 and, by 2013, had launched its first geosynchronous satellite. But its main selling point was its record-low cost per launch with the Falcon 9 rocket. At US$56.5 million, it was cheaper than Europe's Ariane 5 or ILS's (International Launch Services) Proton vehicle. In comparison, a Chinese launch (using the Long March 3B) would cost US$15 million more.

SpaceX was able to achieve such low launch costs by manufacturing the entire rocket in-house, with 'no middlemen', and through specific design choices. After SpaceX's successful launches at reduced cost, other providers are looking to cut overheads. This came after several European satellite operators requested that the ESA find ways to reduce Ariane 5 launch costs.

How Have VSAT and Satellite Operators Responded?

Eutelsat, one of Talia's partners, announced plans to save nearly 20% by exploiting the increased competition between Arianespace and SpaceX, along with switching from chemical to electric propulsion, depending on the payload.

As SpaceX gains market share and expands its manufacturing to accommodate demand, costs should be reduced even further through economies of scale. This will mean cheaper launch costs for satellite operators, which will mean lower costs for VSAT providers and, ultimately, for VSAT customers.

Future Prices and Industry Effects?

Currently, SpaceX is designing reusable boosters for stage one of the launch. Stage one carries the vehicle from the ground into the upper atmosphere, before the stage two rocket thrusts it into orbit. Designing the stage one boosters to be reusable, unlike conventional boosters, which are disposed of in the atmosphere or the ocean, could save time and money on manufacturing. Considering that the fuel costs around US$200,000 while nearly US$60 million goes into manufacturing the Falcon 9 rocket, the savings could be enormous.
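A back-of-the-envelope sketch of why reuse matters, using the two figures above. The ten-flight reuse count is an assumption for illustration, and refurbishment costs are ignored:

```python
# Figures quoted above; the reuse count is an illustrative assumption
fuel_cost = 200_000             # ~US$200,000 of propellant per launch
manufacture_cost = 60_000_000   # ~US$60M to build a Falcon 9
flights_per_booster = 10        # assumed reflights (ignores refurbishment)

expendable_per_launch = manufacture_cost + fuel_cost            # new rocket each time
reusable_per_launch = manufacture_cost / flights_per_booster + fuel_cost

print(expendable_per_launch)  # 60200000
print(reusable_per_launch)    # 6200000.0
```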

However, this may still be a while off. The first reusable landing happened in December 2015, and the system will require testing and fine-tuning before it becomes the main part of their product line.

Once reusable stage one rockets become mainstream for SpaceX and other launch providers, launch prices will have vastly decreased, from possibly hundreds of millions to tens of millions (or maybe even less). Some estimates put launch costs at US$40 million within the near future.

This reduction in price will likely bring increased competition among satellite manufacturers, who will try to decrease their own costs, and an influx of new geostationary satellites for a multitude of purposes. Subsequently, we will likely see new VSAT providers and other communication companies utilising satellite technology.

This increased competition will likely see prices fall for customers, and we will see an increase in new customers. However, it is unlikely that this will replace terrestrial internet for general consumption and day-to-day use for the masses – at least anytime soon.