Gigabit LTE will ensure a smooth transition from 4G to 5G by providing wide-area coverage, seamless mobility and a more consistent user experience. But how, in such a short space of time, have these mobile speeds even become conceivable?
It is 2018 and we are achieving mobile speeds of over 800Mbit/s (some network operators even claim speeds of up to 1.9Gbit/s, albeit under test conditions), yet just twenty or so years ago the world had no mobile internet at all. In fact, in the early years of mobile connectivity, many would never have believed such speeds were possible, at least not in such a short space of time.
So, how have we got to this point so quickly, and what are the technologies behind the drive towards gigabit mobile speeds and beyond?
In the late 70s, cellular networks were introduced commercially and they completely revolutionised existing connectivity. Until then, mobile systems were based on simple analogue radio technology and restricted to a single area such as a town or city. In many cases these phone systems weren’t full duplex, meaning that only one person could talk on a call at a time (much like a ‘walkie-talkie’ system).
The new cellular networks, by contrast, divided coverage into cells (as the name suggests), meaning that the same frequencies could be reused for different calls, as long as they took place in different cells.
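As a toy illustration (not taken from any real network plan), the reuse idea can be sketched as a cluster of cells, each assigned a distinct channel group, with the pattern repeating across clusters; the cluster size of 7 used here is a hypothetical choice, and real plans vary:

```python
# Sketch of cellular frequency reuse: cells in the same cluster get
# different channel groups, while cells in different clusters may
# reuse the same group because they are far enough apart.

REUSE_FACTOR = 7  # hypothetical cluster size; real plans use 3, 4, 7, 12...

def channel_group(cell_id: int) -> int:
    """Map a cell to one of REUSE_FACTOR channel groups."""
    return cell_id % REUSE_FACTOR

# Cells 0 and 7 sit in different clusters, so they reuse the same group:
assert channel_group(0) == channel_group(7)
# Adjacent cells within one cluster get different groups:
assert channel_group(1) != channel_group(2)
```

The same handful of frequencies can therefore serve many simultaneous calls, which is what let cellular networks scale far beyond the single-area radio systems that preceded them.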
These early cellular networks were great at the time, and certainly a big leap forward from the simple analogue radio technologies of the past, but in order for carriers to grow their user base they needed a standard (a common set of manufacturer guidelines) that would enable cross-compatibility.
In the early 90s, GSM was rolled out across Europe. It started as a group (Groupe Spécial Mobile), initially set up by the European Conference of Postal and Telecommunications Administrations and tasked with defining a European mobile network standard, but soon the name GSM came to describe the set of technologies it encapsulated.
These technologies included TDM (time-division multiplexing), call compression and authentication & encryption.
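As an illustrative sketch (not a real GSM implementation), time-division multiplexing can be pictured as round-robin interleaving: samples from several calls are slotted, in turn, into repeating time slots on one shared channel:

```python
# Minimal sketch of time-division multiplexing (TDM): each call gets a
# recurring time slot, so one channel carries several calls at once.

def tdm_multiplex(streams):
    """Round-robin one sample per stream into each frame of the channel."""
    frames = []
    for slot_samples in zip(*streams):  # one slot per stream, per frame
        frames.extend(slot_samples)
    return frames

calls = [["a1", "a2"], ["b1", "b2"], ["c1", "c2"]]
print(tdm_multiplex(calls))  # ['a1', 'b1', 'c1', 'a2', 'b2', 'c2']
```

The receiver simply reads its own slot from each frame to reconstruct its call, which is how multiple digital conversations share a single carrier.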
GSM didn’t come without its difficulties, but it was these challenges that led to breakthroughs in many areas. For example, the new digital systems allowed data other than voice to be carried. SMS and mobile fax emerged from GSM, and so too did WAP (Wireless Application Protocol), which enabled simple text-based web browsing on mobile devices.
The initial issue with WAP was that GSM could only provide speeds of up to 9.6kbit/s; nowhere near enough to deliver the high-quality video streaming or images we are used to today. Instead, it allowed users to do things like check the news headlines, view sports scores or read short text-based emails. Not only were these services fairly limited, but phones at the time hadn’t yet been developed with high-resolution colour screens, so even if you were lucky enough to benefit from the services, they weren’t up to much visually.
To offer true online services (of the kind customers were getting used to on their home computers) on handheld mobile devices, further advances in technology were needed.
The first enhancement came at the turn of the millennium, when GPRS (General Packet Radio Service) networks were introduced. This boosted speeds to between 35 and 171kbit/s.
This was closely followed by EDGE (Enhanced Data Rates for GSM Evolution), which boosted speeds even further, to between 120 and 384kbit/s.
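To put those rates in perspective, a rough back-of-the-envelope calculation (using the peak figures above, in kilobits per second, and a hypothetical 100kB image as the payload) shows how each step cut transfer times:

```python
# Approximate best-case transfer times for a 100 kB (800 kbit) image at
# the peak rates quoted in the text; real-world throughput was lower.

rates_kbit_s = {"GSM": 9.6, "GPRS": 171, "EDGE": 384}
payload_kbit = 100 * 8  # 100 kB expressed in kilobits

for tech, rate in rates_kbit_s.items():
    print(f"{tech}: {payload_kbit / rate:.1f} s")
```

Even under ideal conditions, a single image took well over a minute on GSM, a few seconds on GPRS, and around two seconds on EDGE; a useful reminder of why richer content had to wait for each successive upgrade.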
These developments came at the perfect time for network operators, as manufacturers began to release phones with high-resolution colour displays, camera technology and web browsing.
The 3GPP (Third Generation Partnership Project) was formed to prepare the industry for a new, third generation of mobile networks and, importantly, to increase data rates enough to offer image uploading and downloading, online gaming and video streaming.
The upgrade defined a new standard based on UMTS (Universal Mobile Telecommunications System). This utilised HSPA (High Speed Packet Access) to accelerate speeds up to 14.4Mbit/s for downloads and 5.76Mbit/s for uploads.
Thereafter, a further upgrade to HSPA was released, called HSPA+, which increased speeds again, to 42Mbit/s download and 22Mbit/s upload.
At this point smartphones began to take over the market and users were welcoming access to online apps, browsing, video streaming and most of the content we are used to viewing today, albeit at a slower rate. Manufacturers, network operators and consumers were all winning, with new devices being released which boasted the ‘best cameras’ available and access to online services.
It wasn’t until the emergence of things like live streaming and cloud-based storage that consumers began to demand better speeds from their providers, and so came the arrival of 4G.
In order to offer even further improvements in data rates, 3GPP proposed evolving the existing GSM/UMTS technology rather than seeking an alternative. This evolution came to be known as LTE (Long Term Evolution).
The architecture used in LTE was designed to surpass the mobile data rates available with 3G technologies. In 3G networks, the radio network controller, or RNC, controlled the NodeB base stations in the network. In LTE networks, however, the base stations have embedded control functionality and are called eNBs (evolved NodeBs), removing the need for an RNC altogether.
This simplified, flatter network architecture meant response times were much quicker, so users of the network could realise much better data rates, with LTE boasting speeds of up to 100Mbit/s.
Since LTE, there have been further improvements such as LTE-A and LTE-A Pro, each of which has surpassed the speed of its predecessor.
LTE-A (Advanced) improved on the architecture of LTE, before being outmoded by LTE-A Pro, which aimed not only to improve the existing network but also to prepare it for the introduction of 5G in the next few years.
Many might argue that we have reached a point in mobile communications where we can do what we need to at the speeds we need, and that anything faster would be meaningless or unnoticeable. However, with the emergence of new technologies such as the Internet of Things, and with global ambitions for smart cities, smart farming and connected people, current data rates will not suffice.
Instead, it is important to keep striving to improve our current technologies, allowing upload and download speeds to increase further whilst bringing latency down.
Passing the gigabit mark has, on paper, already been done (at least in lab conditions), but before we can realise these speeds in real-world scenarios, there are many more obstacles to overcome.