Cold War Tech: It’s Still Here, And Still Being Used

I’m a Cold War kid. I grew up watching news of Pershing II and SS-20 deployments in Europe and the Soviet war in Afghanistan, with some Terminator and Top Gun VHS action on the side. Yugoslavia was trying to play both sides, and for a while it worked like a charm. It all crashed a couple of years after the Berlin Wall came tumbling down, rendering our non-aligned prowess pointless.

I admit this is an odd intro for a tech blog, but bear with me; it will start to make sense. Unlike most Europeans, we had good relations with both blocs. We sold tanks to Kuwait and rocket artillery to Saddam, we bought cheap fuel and MiGs from the Soviets, and in return we exported some stuff they couldn’t get directly from the West. I know people who would stay in East Berlin hotels because they were cheaper, then cross the border into West Berlin to work, play and shop, only to cross back via virtually unused border crossings like Checkpoint Charlie, all in a matter of hours.

On one such trip, my dad got me a Commodore C64, which was pressed into service as our Cold War gaming machine. Most 80s video games, and indeed a lot of music and films, were inspired by countless proxy wars and the threat of a nuclear apocalypse. As the Wall came down, a lot of people assumed that would be the end of runaway defence spending and that the world would be a safer place. It didn’t exactly work out that way, did it?

However, the long-term effect of the Cold War on science and technology is more profound than Nena’s 99 Luftballons, or any Oliver Stone Vietnam flick.

Minuteman: A Cold War Tech Case Study

If you are reading this, you’re already using a technology developed for cold warriors: the Internet. That’s not all. A lot of tech and infrastructure we take for granted was developed, or at least conceived, during these tumultuous decades.

That constellation of GPS satellites orbiting Earth? It wasn’t put up there to geotag selfies or get an Uber ride; it was designed to help the US Strategic Air Command deliver hundreds of megatons worth of instant sunshine on Soviet targets with pinpoint accuracy. Integrated circuits, transistors, solid-state computing? Yep, all developed for the armed forces and paid for by the US taxpayer.

Here is just one example: the sleek and unfathomably deadly LGM-30 Minuteman intercontinental ballistic missile (ICBM). It wasn’t the first ICBM out there, but when it appeared, it was revolutionary. It was a solid-fuel missile, which meant it could respond to a threat and launch in a minute without having to be fuelled, hence the name. But solid fuel was only part of the story: solid-state electronics were a lot more interesting from a geek perspective. Prior to Minuteman, ICBMs relied on analogue computers with mechanical gyros and primitive sensors. Since they were wired to a specific target, the target package could not be changed easily. Minuteman was the first mass implementation of a general-purpose digital computer; it integrated an autopilot and missile guidance system in one package, with reliable storage that could take the stress of a silo launch. The computer could also store multiple targets, and was reprogrammable.
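To make the contrast with hard-wired analogue guidance concrete, here’s a minimal toy sketch in Python. It is purely illustrative and invented for this post (the real Minuteman computer, the Autonetics D-17B, worked nothing like this under the hood); the point it demonstrates is simply that on a general-purpose digital computer, targets are data, so retargeting is an upload rather than a rewiring job.

```python
# A toy illustration invented for this post, NOT the actual D-17B design.
from dataclasses import dataclass

@dataclass
class TargetPackage:
    label: str          # hypothetical identifier
    latitude: float
    longitude: float

class GuidanceComputer:
    """Toy model of reprogrammable, multi-target storage."""

    def __init__(self):
        # An analogue system was effectively wired to a single, fixed target.
        self.targets = []

    def upload(self, packages):
        # "Reprogramming": replace the stored target data wholesale.
        self.targets = list(packages)

    def select(self, index):
        # Switching targets is a data lookup, not a hardware change.
        return self.targets[index]

computer = GuidanceComputer()
computer.upload([TargetPackage("alpha", 51.5, -0.1),
                 TargetPackage("bravo", 48.9, 2.4)])
print(computer.select(1))   # switching targets changes data, not wiring
```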

Transistors were nothing new at that point; they had been developed years earlier at Bell Labs. However, these primitive transistors were almost exclusively reserved for the military-industrial complex. Uncle Sam was the sole customer for virtually all early computers and chips, burning heaps of money. These early transistors offered a quantum leap over vacuum tubes, but they weren’t perfect. By today’s standards, they were rubbish. The reliability simply wasn’t there, and if you needed to launch a few hundred thermonuclear warheads halfway across the planet, you sort of needed a guidance system that wouldn’t fail as soon as the candle was lit.

So what do you do when you come across a technical problem you can’t solve with money? Simple: You throw more money at it, and that’s exactly what the US Air Force did. They burned millions to make the damn things reliable enough to be used in harsh environments and survive the stress of a high-G ascent to space. This was known as the Minuteman High Reliability (Hi-Rel) programme.

The first truly mobile digital computer was somewhat deadlier than your notebook and iPhone.

It worked, but the USAF got a bit more than they bargained for. In trying to improve a single weapons system, the USAF ended up giving a huge boost to the tech industry in general. Eventually, the Minuteman was upgraded to include a new microchip-based guidance system, with a primitive form of solid-state storage. This Cold War relic has been in service since the Kennedy administration, and the current incarnation has been around for 45 years, receiving multiple hardware and software updates over the years.

So, in outlining the development and evolution of a single strategic weapon delivery system, I have touched on a number of vital technologies we take for granted: reliable transistors, chips, solid-state storage, mass-produced programmable computers and so on. The Minuteman was also the first mobile digital computer.

Some may argue that the legacy of such weapons is that Mutually Assured Destruction (MAD), guaranteed by the nuclear triad, kept superpowers from going to all-out war. It probably did, but in doing so, it also allowed engineers around the world to develop technologies and concepts applicable in various industries and fields of study.

Their real legacy lies in every integrated circuit on the planet.

Capitalist Pioneers Try To Cash In

What could be more capitalist than monetising instruments of mass murder? The taxpayers paid for their development, not venture capitalists!

Joking aside, it can be argued that the Red Scare of the fifties created Silicon Valley. Most of the money really did come from taxpayers, and most companies that landed lucrative defence contracts were quick to make a buck on dual-use technology developed for the military. Remember Bell Labs? William Shockley, co-inventor of the transistor there, went west to start his own semiconductor company; eight of his brightest recruits soon left to found Fairchild Semiconductor, and two of them went on to create Intel a decade later. The updated Minuteman guidance computer was based on chips from another semiconductor giant: Texas Instruments.

I am not disputing the brilliance of people like Intel co-founders Robert Noyce and Gordon Moore. I have no doubt they would have made their mark on the tech industry even without the biggest arms race in history, but it’s hard to argue the industry would have developed at anywhere near the same pace without government funding. Yes, the taxpayers effectively subsidised the tech industry for decades, but in the long run, they’re probably better off. Westinghouse did not need subsidies to develop washing machines and refrigerators, because consumer demand was strong, but in the early days of computing there was virtually no consumer demand. That’s why governments had to step in.

But what did the taxpayer get?

The Internet, GPS, reliable transistors and chips: Cold War tech made possible by runaway defence spending.

The space and arms race spawned a number of technologies that in turn created countless business opportunities. Even primitive computers had a profound impact on industry. They made energy grids and transportation infrastructure more efficient, improved safety in industrial facilities, including sensitive chemical and nuclear plants, and changed the face of banking, communications, entertainment and so on.

Best of all, we somehow managed not to blow ourselves up with the weapons these technologies made possible, and at the same time we turned swords into ploughshares. Back in the fifties, the US and USSR launched initiatives to examine civilian uses of nuclear power (including schemes to use nuclear explosives for civil engineering, which went terribly wrong), but they amounted to nothing. It wasn’t the might of the atom that changed the world; it was the humble microchip and the ancillary technologies developed for countless defence programmes.

Before they made their mark in science and beat Garry Kasparov at the chess table, supercomputers and their analogue predecessors were used to simulate physical processes vital to the development of thermonuclear weapons. An advantage in sheer computing power could yield advances in countless fields. Computer simulations allowed western navies to develop quieter submarines with new screws, digitally optimised to avoid cavitation. Digital signal processors (DSPs) made sonars far more sensitive, and a couple of decades later, advanced DSPs made music sound better. Computer-aided design wasn’t just used to reduce the radar cross-section of airplanes; it also made our buildings and cars cheaper, safer and more energy efficient.
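To make the sonar point concrete, here is a minimal sketch of the core trick behind that sensitivity, a matched filter: cross-correlate the noisy received signal against a replica of the transmitted pulse, and the echo stands out at the correct delay even when it is invisible in the raw waveform. Everything below (waveform, sample rate, noise level) is invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known transmitted pulse: a 50 ms linear chirp (invented parameters).
fs = 8000                                        # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)
ping = np.sin(2 * np.pi * (500 + 4000 * t) * t)  # replica of the pulse

# Received signal: heavy noise with a faint echo hidden at sample 1500.
received = rng.normal(0.0, 1.0, 4000)
received[1500:1500 + ping.size] += 0.5 * ping    # echo weaker than the noise

# Matched filter: cross-correlate against the known replica. The output
# peaks at the lag where the echo lines up with the template.
mf = np.correlate(received, ping, mode="valid")
print(f"Echo detected near sample {np.argmax(np.abs(mf))} (true position: 1500)")
```

The same principle, pushed much further, is what made digital sonar so much more sensitive than its analogue predecessors, and it lives on in radar pulse compression and in how GPS receivers lock onto satellite signals.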

Some of these efforts resulted in a technological dead-end, but most did not. One of my favourite tech duds was Blue Peacock, a British nuclear landmine (yes, landmine, not bomb), weighing in at 7.2 tons. Since it relied on early 50s technology and had to be buried in the German countryside, the engineers quickly realised the cold could kill the electronics inside, so they tried to figure out how to keep circuits warm. Their solution was so outlandish that it was mistaken for an April Fool’s Day joke when the design was declassified on April 1, 2004.

No chickens were harmed in the making of this blog post, or in the Blue Peacock nuclear landmine programme.

A chicken was to be sealed inside the casing, with enough food and water to stay alive for a week. Its body heat would keep the bomb’s electronics operational.

As civilian industries started implementing these cutting-edge technologies en masse, our quality of life and productivity shot up. Our TVs, cars, phones, the clothes we wear, and just about any consumer product we buy: they’re all better thanks to the biggest waste of money in history. Granted, we all have trace amounts of strontium-90 in our bones, but in the big scheme of things, it’s a small price to pay for the high-tech world we enjoy so much.

Oh yes, we also got video games out of it. Loads and loads of video games.

Kickstarting Game Development

Video games were pioneered on the earliest digital computers (and some analogue ones as well). In fact, Tennis for Two, arguably the first game to use a graphical display, was developed for an analogue computer in 1958. However, not even Bond villains had computers at that point, so the rise of the video game industry had to wait for hardware to mature.

By the mid-to-late seventies, microchips had become cheap enough for mass-market applications. Now that we had the hardware, we just needed software developers and a use case for cheap chips. Since the average consumer was not interested in expensive and complicated computers designed for big business, attention shifted to gaming: arcades, game consoles and inexpensive home computers like the Sinclair ZX Spectrum and Commodore 64.

These humble machines brought programmable computers to millions of households, hooking a generation of kids on digital entertainment and creating opportunities for game developers. Consoles and cheap computers brought the arcade to the living room, ushering in a new era of video gaming and creating countless jobs in the industry. Even the Soviets got in on it with Tetris, the first game from behind the Iron Curtain.

The advent of inexpensive home computers and game consoles created a generation hooked on computing and coding.

It wasn’t just entertainment. Unlike consoles, the Spectrum and C64 were proper computers, and geeky kids quickly found new uses for them. They started making demos; they started coding. Chances are you know a lot of these kids, and if you’re reading this, you probably work with some of them.

If you’re interested in the development of early video games, and what the Cold War had to do with them, I suggest you check out Nuclear Fruit, a new documentary that’s a must-see for all geeks and gamers born in the 70s and early 80s.

These guys and gals went on to develop a new breed of video games, build successful online businesses, create new technologies and revolutionise the digital world, all in the space of a decade. A generation that grew up with the constant threat of nuclear war, enjoying dystopian science fiction, helped make the world a better place. They didn’t develop Skynet, they developed millions of mobile and web apps instead.

So, no Terminators. At least, not yet.

Cold War 2.0 And The Emergence Of New Threats

This is not a geopolitical blog, but if you happen to follow the news, you probably know the world is a messed-up place. No, the end of the Cold War didn’t bring an era of peace and stability, and there’s already talk of a “Second Cold War,” or worse, a “hot” war. While most of these worries are nothing more than hype and sensationalism, a number of serious threats remain. The threat of nuclear annihilation is all but gone, but the technology we love so much has created a host of potential threats and issues, ranging from privacy and security to ethical concerns.

Thankfully, we aren’t likely to see an arms race to rival the one we witnessed in the 20th Century, but we don’t have to. The same technology that makes our lives easier and more productive can also be used against us. The digital infrastructure we rely on for work and play is fragile and can be targeted by criminals, foreign governments, non-state actors, and even lone nutjobs with a grudge.

These new threats include, but are not limited to:

  • Cybercrime
  • State-sponsored cyber warfare
  • Misuse of autonomous vehicle technology
  • Privacy breaches
  • Mass surveillance abuses
  • Use of secure communications for criminal/terrorist activities

All of these pose a serious challenge, and the industry is having trouble keeping up. My argument is simple: we no longer have to develop ground-breaking technology to get an edge in geopolitical struggles, but we will continue to develop technologies and methods for tackling new threats and problems. It’s a vicious circle, since these new threats are made possible by our reliance on digital communications and the wide availability of technologies that can be employed by hostile organisations and individuals.

A new generation of emerging threats is once again rallying industry leaders and governments around a common cause.

Cybercrime is usually associated with identity theft and credit card fraud, but it’s no longer limited to these fields. The advent of secure communication channels has allowed criminals to expand into new niches. The scene has come a long way since the romanticised exploits of phone phreaks like Steve Wozniak. Some offer hacking for hire, others are willing to host all sorts of illicit content, no questions asked. Some groups specialise in money laundering, darknet drug bazaars, and so on. The biggest threat with this new generation of cybercrime is that you no longer have to possess many skills to get involved. As cybercrime matures, different groups specialise in different activities, and they can be hired.

State-sponsored cyberwarfare poses a serious threat to infrastructure, financial systems, and national security. However, there is not much an individual can do in the face of these threats, so there’s no point in dwelling on them in this post. Another form of economic warfare could be to deprive a nation or region of Internet access. It has happened before, sometimes by accident, sometimes by government decree or enemy action.

Commercial drones don’t have much in common with their military counterparts. Their range and payload are very limited, and while a military drone can usually loiter over an area for hours on end, the endurance of hobbyist drones is limited to minutes rather than hours. This does not mean they cannot be used for crime; they can still invade someone’s privacy, smuggle drugs across a border, or even carry explosives. Autonomous cars are still in their infancy, so I don’t feel the need to discuss the myriad of questions they will raise.

Privacy remains one of the biggest Internet-related concerns expressed by the average person. This is understandable; we have moved so much of our daily lives to the digital sphere, placing our privacy at risk. People don’t even have to be specifically targeted to have their privacy and personal integrity compromised. Most data that makes its way online is released in the form of massive dumps following a security breach affecting many, if not all, users of a particular online service. People will continue to demand more privacy, and in turn clients will demand more security from software engineers (who aren’t miracle workers and can’t guarantee absolute security and privacy).
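As one concrete example of what “more security” can mean in engineering practice, here is a minimal sketch, my own generic illustration rather than anything prescribed in this post, of storing passwords as salted, deliberately slow hashes so that a leaked database dump doesn’t hand attackers usable credentials. It uses only Python’s standard library; the iteration count is an illustrative placeholder, not a recommendation.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000   # illustrative placeholder; tune to current guidance

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the plaintext password."""
    salt = os.urandom(16)   # unique random salt per user defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)   # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("password123", salt, digest))                   # False
```

With a scheme like this, a breach still leaks data, but each password has to be attacked individually and slowly, which is exactly the sort of trade-off clients are increasingly asking engineers to make.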

Mass surveillance is usually performed by governments and should not represent a threat to the average citizen or business. However, it’s still a potential threat, as it can be abused by disgruntled workers, foreign governments, or by way of data breaches. The other problem is the sheer cost to the taxpayer; mass surveillance doesn’t come cheap, yet we will probably see more of it.

Most governments wouldn’t bother with mass surveillance and metadata programmes if they weren’t facing very real threats. The same technology developed to keep our communications and online activities private can be abused by all sorts of individuals we wouldn’t like to meet in a dark alley. The list includes multinational crime syndicates, terrorists and insurgents. However, not all of this communication needs to be encrypted and secure. The point of propaganda is to make it widely available to anyone, and the Internet has given every crackpot with a smartphone the biggest megaphone in history, with global reach, free of charge. You can use the Internet to rally a million people around a good cause in a matter of days, but the same principles can be applied to a bad cause. If the target audience is people willing to join a death cult with a penchant for black flags, you don’t need a million people; you just need a few dozen.

The Difference Between Science And Science Fiction

For all their brilliance, the science fiction authors who helped shape popular culture in the 20th Century didn’t see the real future coming. They didn’t exactly envision the Internet, let alone its profound impact on society.

Sorry to burst your bubble, but Terminators and artificial intelligence (AI) aren’t a threat yet, and won’t be anytime soon. The real threats are more down to earth, but that does not mean we can afford to ignore them. You don’t need a super-intelligent automaton from the future to create havoc; a few lines of really nasty code can disrupt infrastructure and cause all sorts of problems. And since eBay doesn’t carry Terminators, it’s a lot easier to use an off-the-shelf drone, programmed to deliver a payload to a specific target: drugs to a trafficker, or an explosive charge to a VIP.

But these aren’t the biggest threats; they’re just potential threats, something for a Hollywood script, not a tech blog.

The real threats are criminal in nature, but they tend to stay in the cyber realm. You don’t have to physically move anything to move dirty money and information online. Law enforcement is already having a hard time keeping up with cybercrime, which seems to be getting worse. While it’s true that the crime rate in developed countries is going down, the statistics don’t paint the full picture. A few weeks ago, the British Office for National Statistics (ONS) reported a near-twofold increase in the crime rate for England and Wales, totalling more than 11.6 million offences, once online offences were counted. The traditional crime rate continued to fall, but the new figures included 5.1 million online fraud incidents.

The cost of physical crime is going down, but the cost of cybercrime is starting to catch up. I strongly believe the industry will have to do more to bolster security, and governments will have to invest in online security and crime prevention as well.

Just in case you are into dystopian fiction and don’t find criminal threats exciting, another frightening development would be data monopolisation: a process in which industry giants would command such a competitive lead, made possible by their vast user bases, as to render competition pointless, thus stifling innovation.

Yes, I am aware that Terminators would make for a more eventful future and interesting blog post, but we’re not there yet.

Source: Toptal