Has the adoption of new technology increased due to the pandemic?

The COVID-19 pandemic has impacted virtually everyone on earth. It has also greatly affected the way in which society operates, from business to education to the global supply chain.

One of the most prominent changes brought about by the crisis is the accelerated use of technology in these areas: only months of a global pandemic managed to initiate what might otherwise have taken years of technological integration.

Because the response to a pandemic is to separate people so that they do not transmit a novel virus, technology quickly became the solution for bridging the gaps that separation created.

Consumers have increasingly turned to online channels during the pandemic, and businesses and industries alike have responded in kind.

For example, many businesses have implemented online workplaces, with virtualised teams now commonplace. Although it may not have been considered practical prior to the crisis, remote work was the only option for many businesses forced to close offices during the initial shutdown. Boardrooms quickly turned virtual, and operations became almost completely digital. As these changes grew in popularity, so too did companies' willingness to implement them into their business models across the board.

As a result, technologies such as video conferencing, remote collaboration tools, and digital file storage and transfer have become far more commonplace. This has opened a door for companies within the sector to further innovate and improve their products and services.

Many companies, both big and small, were forced to re-evaluate their online strategy at the onset of the pandemic. Many quickly found that the technology and digital presence they had prior to the crisis would not hold up to the full demand now placed on them.

Because of this, many organisations became much more digitally literate, allowing themselves to be found more easily, and to operate more efficiently, online. This has also greatly changed the way in which society works, as we now have a market that is notably more technologically integrated than before the pandemic.

Furthermore, COVID-19 and the technological relevance that has come with it have prompted many companies to explore automation, reimagining the way in which processes such as product manufacturing and delivery are done. This has worked particularly well in sectors such as the automotive industry, with many companies looking into automating their assembly lines.

An immediate dependency on technology has forced society to expedite the natural progression of digital innovation, requiring changes that would otherwise have been considered abrupt. Technologies such as QR codes, once regarded mainly as a novelty, are now thoroughly mainstream and a practical necessity for some people.

As the pandemic progressed, society evolved with it, shifting towards a closer alignment with technology. Technological changes that were fairly new prior to the crisis have now become commonplace, and the general population has seamlessly integrated them into daily life. Whether some of these changes are good or bad is debatable, but they undoubtedly represent a huge advance in a myriad of ways, with effects that will continue to be felt.

How is spatial computing evolving?

As virtual reality technology becomes increasingly prominent in the market, the field of spatial computing is evolving to create new ways for people to understand and interact with the physical world.

So, what is spatial computing?

Spatial computing combines virtual reality with augmented reality, using text, sound, and images to enrich users’ insight and experience. When a spatial model of a city is created, the city is no longer a flat, two-dimensional object on a computer screen; it is a three-dimensional virtual world that users can inhabit in this ‘mixed reality’ mode.

Mixed reality in spatial computing gives city planners and others a way to interact with parts of the city and simulate scenarios to understand and analyze the effects of any changes. For instance, when a city planning department wants to add to the built environment of the city, researchers and engineers must study the impacts of the planned changes on the city and its citizens. Traditionally, this would have required them to prepare large data sets of detailed documents and calculations that may only have been understood by a select group of people and could go out of date over the course of the project.

Today, architecture plans are accompanied by virtual models that showcase what a new building will look like in its intended location. Spatial computing goes further than that by presenting the building and its surroundings as mixed reality. Using this, together with existing data, the construction of the new building and any other changes can be simulated to identify impacts on the environment and people. For example, if data on current noise pressure, air quality and so on is gathered and stored, it can be used in a spatial computing model in combination with, for example, the sound reflection or sound absorption rates of the proposed design and construction materials. The impact of a new building on natural air flow, for example, can be visualized, showing any related effects on air quality and possible heat spots.
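As a rough illustration of the kind of calculation such a model might draw on (the source level, distance and absorption figure below are hypothetical, and real acoustic simulations are far more detailed), a short Python sketch can estimate the noise level at a receiving point from geometric spreading plus a simple allowance for an absorbing facade:

    import math

    def estimated_noise_level(source_db, distance_m, absorption_coefficient):
        """Rough point-source estimate: spherical spreading loss relative to a
        1 m reference distance, plus a crude reduction (up to ~10 dB) for an
        absorbing facade. Illustrative only."""
        spreading_loss = 20 * math.log10(max(distance_m, 1.0))
        absorption_loss = 10 * absorption_coefficient
        return source_db - spreading_loss - absorption_loss

    # Hypothetical example: an 85 dB traffic source heard 50 m away, screened
    # by a facade with an assumed absorption coefficient of 0.6.
    print(round(estimated_noise_level(85, 50, 0.6), 1))  # about 45.0 dB

In a full spatial computing model, calculations like this would be run across the whole three-dimensional scene and visualized in place, rather than returned as a single number.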

How 5G will impact the telecommunications industry

The groundbreaking rise of 5G technology is just around the corner and is expected to completely change the telecommunications industry and how people use technology altogether.

This is particularly exciting because it has only been a decade since 4G was introduced; yet 5G is designed to offer higher bandwidth, latency as low as 1-2 milliseconds, and the ability to connect far more devices.

Clearly, drastic changes are being made to the ever-growing telecoms industry, and companies need to keep up to stay ahead.

So, what is so different about 5G?

To put it simply, 5G networks are essentially more capable, handling more connections and delivering faster speeds per user. 5G is also planned to work across a wider range of radio frequencies, opening up the ultra-high millimetre wave bands for carriers to expand their network offerings.

5G is also intended to be resilient enough to support activities as critical as remote medical surgery. This is because 5G signals can be broadcast across different parts of the spectrum using different technologies, providing redundancy and an extremely small margin for error.

Speeds are expected to reach more than 1 gigabit per second. To put this in perspective, users will be able to download films in a matter of seconds. It is argued that 4G is incomparable to this fifth generation of wireless connectivity because of the innovative way 5G can operate on new frequencies and systems.
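As a back-of-the-envelope check (the 2 GB film size and the 50 Mbps 4G comparison are assumptions for illustration), download time is simply file size divided by link speed:

    def download_time_seconds(file_size_gigabytes, link_speed_gbps):
        """Time to transfer a file, ignoring protocol overhead and congestion."""
        file_size_gigabits = file_size_gigabytes * 8  # bytes -> bits
        return file_size_gigabits / link_speed_gbps

    # An assumed 2 GB film over a 1 Gbps 5G link versus a 50 Mbps 4G link.
    print(download_time_seconds(2, 1.0))   # 16.0 seconds
    print(download_time_seconds(2, 0.05))  # 320.0 seconds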

Additionally, 5G will play a significant role in shaping the new age of technology. From the Internet of Things to smart homes, self-driving cars, telemedicine and more, the deployment of 5G networks will propel the reinvention of several businesses and consumer demands.

Research states that 5G will boost global economic output by 12.3 trillion dollars by 2035, demonstrating a significant impact.

Moreover, telecoms businesses will be heavily altered. Aside from improved data rates, ultra-low latency will bring new levels of network responsiveness, which are set to complement emerging technologies.

To name a few:

As a result, companies can play a more assertive role in their industry value chain by having a more intimate relationship with their networks.  

With this innovative breakthrough in technology, adaptation and reconstruction in the telecoms industry will need to take place. Traditional businesses must adapt to the changing digital environment, and research suggests that digitalization provides a good opportunity for a company’s growth.

As broadband connectivity continues to progress, it only makes sense that the telecoms industry does the same to stay ahead.

Digitalization offers telecom companies the opportunity to rebuild their market position, reimagine their business systems and create innovative offerings for customers. The European Telecommunications Network Operators state that the companies that will reap the most benefit from these advancements are those that transform from being connectivity providers into digital service providers.

How Can DDoS Affect Your Business?

Since the start of the millennium, cybercrime has become an ever-increasing threat to society as our daily lives have steadily moved into the digital realm.

I am sure a lot of you watching this use the Internet or online applications to undertake your day-to-day job. Therefore, whether you work in a Fortune 500 company or a small firm, your business is at risk of a cyber-attack.

Implementing effective DDoS protection is key to ensuring your web property is secure and that you are ready to fight off any raids.

A distributed denial of service attack or ‘DDoS attack’ is a common type of cyber-attack where an attacker aims to impair the function of a web server by overwhelming it with fake traffic.

So how do these attacks work?

Network resources, such as websites, have a finite limit to how many users can access their server at a time. At the same time, the system that connects the server to the Internet has limited bandwidth and capacity. Criminals, using a network of connected online devices collectively known as a botnet or ‘zombie network’, send an extremely large number of requests to their victim to saturate the server with huge volumes of traffic. These connected devices – often PCs, routers, or mobile devices – are infected with malware that lets attackers take control, so the attack traffic comes from devices spread across the Internet, making it harder to detect and deflect. As a result, the server exceeds its capacity and the level of service is hindered and suffers in several ways:

Firstly, the response to requests will be much slower than normal, or secondly, some – or all – users’ requests may be totally ignored. So, if for example you are running an e-commerce website and your customers are unable to reach your site, you would see a reduction in revenue. If you are a banking institution and your staff are unable to access online systems to process requests, this would have a significant impact on your clients. In addition, the very nature of DDoS attacks means that they can affect a business for a short period or for weeks, even months.

But what do attackers wish to achieve with these attacks?

Motivations behind a DDoS attack may differ. In some instances, the aim may be to bring down a competitor, to make financial gain, or to provide a smokescreen for other nefarious activity. Others are simply random acts of cyber-vandalism. Nonetheless, protecting your business is essential.

Protection typically works by implementing software that detects attack traffic and redirects it to a scrubbing centre, where malicious data is cleaned out while clean traffic is still allowed to pass through – instead of having to shut off a web server completely and make it inaccessible to real users. On the modern Internet, DDoS traffic comes in many forms, so it is important for a company to evaluate which type of DDoS attack it is most vulnerable to and apply a protection program accordingly.
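As a highly simplified sketch of the detection idea (the window size and threshold below are hypothetical, and commercial scrubbing services use far more sophisticated signals), a filter can track request rates per source address over a sliding window and flag sources that exceed a sensible limit:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10            # assumed sliding window
    MAX_REQUESTS_PER_WINDOW = 100  # assumed per-source threshold

    recent_requests = defaultdict(deque)  # source IP -> recent request timestamps

    def looks_like_flood(source_ip, now=None):
        """Return True if this source exceeds the request threshold within the
        sliding window - a naive heuristic for volumetric attack traffic."""
        now = time.time() if now is None else now
        window = recent_requests[source_ip]
        window.append(now)
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()  # drop timestamps that have fallen outside the window
        return len(window) > MAX_REQUESTS_PER_WINDOW

Flagged sources could then be diverted to the scrubbing centre while everything else passes through untouched.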

These days there are hundreds of different methods of DDoS attack; however, they often fall into three main types:

  1. Volume attacks, which generate massive volumes of traffic that completely saturate bandwidth, creating traffic jams that make it impossible for legitimate traffic to flow into or out of the target site.
  2. Protocol attacks, which are designed to eat up the processing capacity of network infrastructure resources like servers and firewalls with malicious connection requests. These target Layers 3 and 4 of the seven-layer Open Systems Interconnection model used to define how different components in a networking system interconnect.
  3. Application attacks, which exploit weaknesses in the application layer – Layer 7, where the end-user application resides. These are some of the more sophisticated DDoS attacks, opening connections and initiating process and transaction requests that consume resources like disk space and memory.

By understanding potential vulnerabilities in your system, you can identify threats and plan to reduce or limit their effect on your business.

How customer experiences continue to improve through digital transformation

Not long ago, data was hosted in a data centre located close to the company itself. Over time, network and service providers started prioritizing their clients’ needs and began evolving their solutions to be more efficient and simpler for the end-user. As a result, within the last few years the digital world has transformed massively, as businesses around the world have migrated their data onto the cloud instead of storing it within physical networks. This shift is the reason today’s workforce can operate in such a flexible, streamlined way.

As the new normal in companies becomes more technologically reliant, the pace at which technology changes is increasing too. It is therefore more crucial now than ever to rethink the role of IT within your company to ensure it reflects the merging of IT and business strategies. This reflects how organizations are changing: requiring higher bandwidth and flexible voice services, along with facing pressure to migrate more processes to the cloud.

For many businesses around the world, the technological revolution is well on its way to transforming the company forever. Groundbreaking connectivity is a massive enabler across all industries. The most sought-after business applications are also the ones that bring people together, increase team morale and collaboration, and enhance the speed of workflows. Research conducted in this field suggests that this is far less likely to happen without enterprise-grade connectivity and the right voice services.

For a complete revamp of your business network to keep up with the constant digital transformation of our world, contact Technology Exchange today. We are here to simplify this process and provide you with the best IT infrastructure to suit your requirements and reach your business’s goals.

How cyber-security attacks have increased with the COVID-19 pandemic

The era of the COVID-19 pandemic will go down in history as one of the most uniquely unruly times, and not only because of the outbreak of the deadly virus. The virtual world, too, was altered, as the Internet became a lifeline for many businesses and employees alike.

The year 2020 will be remembered as the start of the coronavirus; however, what many do not realize is that the virtual world was equally plagued. Cyberspace became what many relied on to continue daily tasks and keep businesses running, enabling an estimated 80% of the economy to remain largely unaffected by the threatening pandemic. Excluding some industries that were shattered during the spread of the virus – a large proportion of the hospitality industry, for example – the Internet acted as a lifesaver for people around the world.

Nonetheless, with the rapid increase in Internet activity came an exponential increase in malicious cyber-attacks, creating what has been referred to as the ‘cyber pandemic’. The worldwide shift into the virtual world opened the door to a wave of cyber-threats.

The Al-Kuwaiti news reported that the UAE, for example, experienced a 250% increase in cyber-attacks in 2020 alone. This was because organizations had no choice but to reconsider how they worked and performed operations, and cyber-attackers took advantage of the increased digital reliance. Frustration also grew as the forms these attacks took became increasingly unpredictable, leaving workers across the security field unsure how to react and act quickly to this unprecedented change in the environment.

The types of attack differed too. The World Health Organization (WHO) reported that the passwords and email addresses of 450 of its employees were leaked in just one week. Since then, WHO has migrated to a more secure authentication system to protect its workers. Experts in the digital world state that, throughout 2020, people around the world were trusting and surfing the web with sensitive personal and biometric data that was all too easily accessible to cybercriminals. Due to this online pandemic, technology and services are now shifting their priorities to support the current, urgent need for business continuity, remote working, and planning for the transition to the next ‘normal’ in the online world.

At Technology Exchange, we are here to help you migrate your business to a safe and effective cyberspace tailored to you, using industry-leading technologies that will protect your staff and corporate data at all costs. Contact us today to begin planning the implementation of a new and improved cyber security program for your company.

An introduction to edge computing. What is it?

Edge computing refers to the process of allocating and getting data to the correct place at the right time. This is done by locating computing and storage resources at the edge of a network, close to where the data is produced and consumed. Edge computing decentralizes data processing and avoids non-essential data transmissions.
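As a minimal sketch of that idea (the readings, threshold and summary format below are hypothetical), an edge node might reduce a batch of raw sensor readings to a small summary locally, so only essential data crosses the network to the central cloud:

    from statistics import mean

    ALERT_THRESHOLD = 75.0  # assumed limit above which the cloud must be notified

    def process_at_edge(raw_readings):
        """Summarise a batch of raw readings at the edge so that only a compact,
        essential payload is transmitted upstream."""
        peak = max(raw_readings)
        return {
            "count": len(raw_readings),
            "average": round(mean(raw_readings), 2),
            "peak": peak,
            "alert": peak > ALERT_THRESHOLD,
        }

    # The raw readings stay local; only this small summary is uploaded.
    print(process_at_edge([70.2, 71.0, 69.8, 76.4, 70.9]))
    # {'count': 5, 'average': 71.66, 'peak': 76.4, 'alert': True}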

There are many definitions of edge computing; however, the one shown below is the one that Technology Exchange believes best fits the services that we provide:

“Cloud-like capabilities located at the infrastructure edge, including from the user perspective access to elastically-allocated compute, data storage and network resources. Often operated as a seamless extension of a centralized public or private cloud, constructed from micro-data centres deployed at the infrastructure edge”.

A typical network topology for an edge computing implementation involves three major layers:

Edge computing comes with many advantages:

Why forward-thinking telecommunications providers are cloudifying their networks

Like businesses in every sector, many telecom service providers have already moved their IT infrastructure delivery and maintenance to the cloud. Cloudification is an increasingly essential way for any company to cut IT operations costs while gaining the flexibility and agility needed to survive and thrive in today’s fast-changing world.

But how do telecommunication services integrate with cloud services? This is an interesting question, as just a few years ago the conventional thinking was that carrier networks had to run on specialized, purpose-built appliances.

The emergence of network function virtualization (NFV) first opened the door, and it remains a core part of tomorrow’s telecoms network. The first milestone of this journey was the virtualization of network functions: replacing traditional network appliances with highly efficient virtualized functions delivered using industry-standard IT equipment. Through virtualization, telcos can use standardized server hardware for multiple purposes, instead of relying on bespoke appliances.

In more recent years, two trends have advanced the cloudification of telecommunication networks. First, in addition to pure virtualization, telcos have started to employ cloud business and cloud management practices, such as infrastructure as a service (IaaS), for their network functions – a game changer for next-generation telecommunication networks.

Why Passwords are Problematic

According to Microsoft's Professor Woodward, passwords are a decades-old concept, and Computer Weekly notes that they remain a ubiquitous part of the digital age. Even so, it is imperative to look for something different and more secure to replace them.

Passwords pose one of the biggest challenges to cyber security today, largely because they are so easy to guess.

Here are some of the most common password problems: 

How to Deal With the Password Problem 

An average online user may have about 50 online profiles, requiring just as many passwords. For a password to be effective, it needs to be built from an uncommon phrase of more than eight characters.

To increase the effectiveness of a password, Microsoft states that: 

With all these requirements, it becomes impossible to memorise 50 complex and unique passwords. 
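A rough way to see why length and character variety matter (the figures below are illustrative, not any vendor's guidance) is to compare the size of the search space an attacker would have to cover:

    import math

    def search_space_bits(length, alphabet_size):
        """Approximate password strength in bits of entropy, assuming each
        character is chosen independently from the given alphabet."""
        return length * math.log2(alphabet_size)

    # Eight lowercase letters versus a 16-character mix of upper case, lower
    # case, digits and symbols (an assumed 94-character alphabet).
    print(round(search_space_bits(8, 26), 1))   # about 37.6 bits
    print(round(search_space_bits(16, 94), 1))  # about 104.9 bits

Every extra bit doubles the work for a brute-force attacker, which is exactly why memorising dozens of genuinely strong, unique passwords is unrealistic.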

Microsoft’s Solution 

Since March, the tech giant has rolled out passwordless accounts across both Microsoft and Windows products. Instead of asking for a password, the login process prompts users signing in to a Microsoft account to confirm with their fingerprint (or another secure unlocking feature) on their phone.

Biometrics and special security keys provide a unique and secure alternative to the use of passwords. Due to identity theft and compromised passwords, 67% of banks have invested heavily in facial recognition, fingerprints, and voice recognition, according to a 2019 Global Banking Survey conducted by KPMG.

According to the communique from Microsoft, only the owner of a phone can give fingerprint authentication when prompted to do so. This is more secure than the use of regular passwords.  

So, if you lose your phone or forget your details after upgrading, there are backup options including:

Two-factor authentication will also mean you need two different recovery methods if you lose your phone.  

Companies are starting to rely on multi-factor authentication (MFA) to boost the security of accounts. MFA seeks to identify people in as many different ways as possible, combining measures such as PINs, fingerprint scans, your location, swiping patterns and phone identity to help confirm who you are.
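One widely used 'something you have' factor is the time-based one-time password (TOTP) generated by an authenticator app. A minimal sketch of the standard TOTP calculation (RFC 6238), using only the Python standard library and an example secret, looks like this:

    import base64, hashlib, hmac, struct, time

    def totp(shared_secret_base32, time_step=30, digits=6):
        """Derive the current time-based one-time password from a shared secret."""
        key = base64.b32decode(shared_secret_base32, casefold=True)
        counter = int(time.time()) // time_step            # elapsed 30-second steps
        msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Example secret for illustration only; in practice the server and the
    # phone app share it, so both can compute the same short-lived code.
    print(totp("JBSWY3DPEHPK3PXP"))

Because the code changes every 30 seconds and never travels as a reusable password, a leaked code is of little long-term value to an attacker.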

Conclusion 

According to Ali Niknam, the CEO of Bunq, biometrics will not replace passwords. However, combining different factors through multi-factor authentication will be critical in enhancing information security.

What Does the Future Hold for High Throughput Satellites (HTS)?

Satellite technology has evolved constantly over the 60 years since its inception, moving from primarily government-funded projects to, more recently, commercial ventures that provide data infrastructure to some of the most remote areas of the world.

Along with the increased competition among space launch providers in recent years, one of the most talked-about developments is high throughput satellites.

What are High Throughput Satellites (HTS)?

There are a few key differences between high throughput satellites compared to their conventional counterparts:

Firstly, HTS offers increased capacity through high-level frequency re-use and spot beam technology enabling multiple narrowly focused beams. While conventional FSS (fixed-satellite service), MSS (mobile satellite service) and BSS (broadcasting satellite service) systems conventionally use wide beams to provide coverage over a large area (thousands of kilometres), HTS uses spot beams that focus on a narrower area (hundreds of kilometres). This reduces the range of a beam from covering entire continents or countries to towns and regions; HTS instead uses many spot beams to cover a large area.

Secondly, because spot beams are so narrow, the satellite can concentrate its power to produce a throughput of 20+ times that of a conventional FSS satellite. This reduces the cost per bit significantly: where a conventional Ku-band FSS satellite could cost $100 million per gigabit per second, ViaSat-1 could supply the same for around $3 million.

Compared to a conventional Ku-band FSS satellite, HTS are capable of carrying over 100 Gbit/s – 100 times more than the conventional satellite. ViaSat-1, launched in 2011, could carry up to 140 Gbit/s, a higher capacity than all of the FSS satellites serving North America combined.
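Using the figures above, a quick back-of-the-envelope comparison of cost per unit of capacity (the ViaSat-1 programme cost below is an assumed round number used purely for illustration) shows where the cost-per-bit saving comes from:

    def cost_per_gbps(total_cost_millions_usd, capacity_gbps):
        """Cost in millions of US dollars per Gbit/s of satellite capacity."""
        return total_cost_millions_usd / capacity_gbps

    # Conventional Ku-band FSS: roughly $100M per Gbit/s, as quoted above.
    # ViaSat-1: an assumed ~$400M programme cost spread over 140 Gbit/s.
    print(cost_per_gbps(100, 1))    # 100.0 ($M per Gbit/s)
    print(cost_per_gbps(400, 140))  # about 2.9 ($M per Gbit/s)

The second figure lands close to the roughly $3 million per gigabit per second quoted for ViaSat-1 above.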

The Future of HTS?

With many new launches happening every year, the number of high throughput satellites in orbit is only going to increase. Launch prices are dropping, and they are broadly the same for HTS and conventional satellites; with roughly 20-100 times the capacity for only about a 50% increase in manufacturing cost, it is a no-brainer which type of satellite manufacturers will choose to create.

Total investments from the 25 operators, who have combined to order nearly 100 HTS systems to date, are estimated to be over $17 billion. Another 123 HTS systems are expected to be launched over the next decade; about 78 have yet to be officially contracted and are still open to the market. Given this level of investment activity, global HTS capacity supply is set to more than triple from 680 Gbps in 2015 to 3 Tbps by 2020.

With capacity no longer a stranglehold on what businesses can use satellite technology for, we will likely see the emergence of new industries that could previously not afford the high prices. It is certain to increase the uptake of VSAT among both businesses and consumers, along with increased backhauling for cellular providers in rural areas, to name a few. Recently, ViaSat signed a new contract with Qantas Airways to provide passengers with in-flight connectivity using a hybrid Ku/Ka-band antenna. It is likely that such deals will only increase in the future and expand to other ventures.

In terms of market value, HTS capacity lease revenues are forecast to jump from $1.1 billion in 2015 to $4.9 billion by 2024, generating over $26 billion in aggregate revenues over the period.

What are the Cons?

While this increase in capacity can reduce cost per bit and provide more efficiency, HTS are still hindered by the common problems of satellite technology.

With high throughput satellites, you need multiple spot beams to cover an entire country, and each spot beam needs its own hub infrastructure. To reach those spot beams, traffic must be routed to the different gateways through terrestrial means, meaning you need terrestrial (fibre) links between each of these hubs along with a central network operations centre (NOC).

Ka and Ku-band suffer more from rain fade than traditional C-band beams – a common complaint among satellite TV subscribers. Despite many solutions addressing this issue (such as site diversity and adaptive coding and modulation), there may still be hesitation from consumers.

Alongside this, latency will still be an issue for high throughput satellite users. While it is an improvement compared to conventional satellites, the latency is still far higher than that of terrestrial voice and broadband services. This has a particular effect on user experience, especially for interactive internet content such as online gaming or remote desktop access. O3b is addressing this issue with MEO (medium Earth orbit) satellites to reduce latency.
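The physics behind that latency gap is easy to sketch (the altitudes are approximate, and the calculation ignores processing delays and terrestrial segments): the minimum one-hop delay is simply twice the satellite's altitude divided by the speed of light.

    SPEED_OF_LIGHT_KM_S = 299_792  # kilometres per second

    def min_propagation_delay_ms(altitude_km):
        """Minimum ground -> satellite -> ground delay for one hop, ignoring
        processing time and any terrestrial backhaul."""
        return 2 * altitude_km / SPEED_OF_LIGHT_KM_S * 1000

    print(round(min_propagation_delay_ms(35_786), 1))  # GEO: about 238.7 ms
    print(round(min_propagation_delay_ms(8_062), 1))   # O3b MEO: about 53.8 ms

Even in the best case, a geostationary hop adds roughly a quarter of a second each way, which is why MEO constellations such as O3b are attractive for latency-sensitive traffic.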

While HTS has a lot of promise and could spell the demise of current FSS, this may not be the case (yet). A lot of mission-critical programmes still use FSS and C-band satellites because of their large footprint and the reliability with which a network can be managed from end to end. Despite the higher costs, that continued reliability could save a company from huge losses if disruption occurred.

The increase in capacity does bring one problem: capacity can only be sold cheaply if you can sell lots of it. While capacity has increased, the challenge now for satellite operators is finding the users to fill that volume. The next task for satellite operators and communication providers is marketing and filling this extra capacity to ensure the promised fall in cost per bit.