Excitement and fear around AI and cognitive computing have reached fever pitch, but are cancer treatment and diagnosis where the technology will have its biggest and most positive impact yet?

“We don’t know what cancer really looks like,” says Helmy Eltoukhy, an electrical engineer from California and the founder of Guardant Health, the first company to commercialise a comprehensive genomic liquid biopsy. “We think we do, but in reality, we are using very simplistic generalised features to categorise cancer.”

Cancer is thought to be a disease of the genome. Sequencing the first human genome took 13 years and three billion dollars, and was completed in 2003. Since then, the price has fallen dramatically – sequencing now costs $1,000 and can be done in a day.

Reaching this milestone will help unlock the secrets of cancer. However, using the large and complex data sets that sequencing produces to achieve better patient treatment and diagnosis is problematic and demanding. So far, the healthcare sector has only scratched the surface – but this is changing thanks to the power of artificial intelligence (AI) and machine learning.

Many tech entrepreneurs, alongside big players such as Alphabet’s Verily, Google’s DeepMind and IBM Watson, are adopting cognitive computing to build tools to better understand cancer and other diseases. According to CB Insights, deals for healthcare-focused AI start-ups increased from fewer than 20 in 2012 to nearly 70 in 2016.

Eltoukhy founded Guardant Health in 2013 to change this. Two years later the company brought the Guardant360 blood test to market.

Used for Stage Three and Stage Four cancer patients, the test unravels genetic sequences of a patient’s cancer to categorise its sub-type without the need for a physical biopsy, which can be dangerous and expensive. Once the mutation that caused the disease is identified, better and more targeted treatment options can be found.

To create the test, it was necessary to detect the faint traces of tumour DNA released into the bloodstream – signals too weak for standard genomic sequencing technology to pick up.

To identify these signals, Eltoukhy and his team borrowed concepts from digital communications. He turned to the work of his former Stanford University professor, John Cioffi, who invented the digital subscriber line (DSL), which delivers speeds around 100 times faster over the same copper phone lines used by dial-up modems.

“The trick there was pre-coding the information sent over copper phone lines and decoding it at the back end – essentially making the information more robust and resistant to the noise introduced by the bandwidth properties of the phone line,” explains Eltoukhy.

“The challenge in DNA sequencing is that instead of putting 0s and 1s in a ‘dry’ form into a copper phone line or communication channel, we are putting four ‘wet’ letters – the A, C, G and T molecules – into the DNA sequencer,” says Eltoukhy. “The challenge was translating a lot of the concepts of digital communication theory into biology, using the same kind of algorithms to lower the error rate by over a thousand-fold.”
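Guardant’s actual error-correction algorithms are proprietary, but the principle Eltoukhy describes – adding redundancy before a noisy channel and decoding it afterwards – can be sketched with the simplest error-correcting code of all: a repetition code with majority-vote decoding. Everything below is an illustrative assumption, not Guardant’s method.

```python
import math

def repetition_error_rate(p, n):
    """Probability a majority vote over n noisy copies still decodes
    the wrong symbol, given a per-copy error probability p."""
    return sum(
        math.comb(n, k) * p**k * (1 - p) ** (n - k)
        for k in range(n // 2 + 1, n + 1)
    )

def majority_decode(copies):
    """Decode one symbol from several noisy reads by majority vote."""
    return max(set(copies), key=copies.count)

# A raw per-read error rate of 1% drops to ~0.001% with 5-fold redundancy.
raw, corrected = 0.01, repetition_error_rate(0.01, 5)
print(f"raw: {raw:.2%}, corrected: {corrected:.6%}")

# Five reads of the same base, one corrupted by noise:
print(majority_decode(["A", "A", "G", "A", "A"]))  # -> A
```

With five-fold redundancy, a 1 per cent per-read error rate falls to roughly 0.001 per cent – the same order of thousand-fold improvement Eltoukhy cites, though real sequencing pipelines use far more sophisticated codes.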

As the team processed thousands of samples, the algorithms improved, just as they do in digital communication. A study of the technology found that common disease-driving mutated genes detected by Guardant360 in breast, lung, colorectal and other cancers were also present in 94-100 per cent of the solid tissue samples taken from trial participants.

The test is now considered to be one of the most comprehensive on the burgeoning liquid biopsy market, looking at mutations in 70 cancer-related genes.

Guardant has processed over 15,000 liquid biopsies across 50 tumour types to improve the test’s performance, and has helped doctors discover cancerous tumours before patients present symptoms.

The technology can easily and painlessly monitor cancer, which is constantly evolving, so treatment plans can be adapted accordingly. To monitor lung cancer, for example, a tissue biopsy would otherwise be necessary, which in the US costs $14,000 per patient and has a 19 per cent complication rate.

Once cancer signals are unlocked, at extremely high fidelity for each type, they are classified and grouped into different sub-sets of cancer. As specificity and sensitivity improve, the company will classify more sub-types of cancer and start decoding signals for Stage One, Stage Two and recurrent cancers.
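Sensitivity and specificity are the two numbers behind that trade-off: sensitivity is the fraction of real cancers a test catches, specificity the fraction of healthy samples it correctly clears. A minimal sketch, with hypothetical validation counts:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of actual cancers the test detects (true positive rate)."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of healthy samples correctly ruled out (true negative rate)."""
    return true_neg / (true_neg + false_pos)

# Hypothetical validation counts for one cancer sub-type:
# 94 of 100 cancers flagged, 990 of 1,000 healthy samples cleared.
print(sensitivity(94, 6))    # 0.94
print(specificity(990, 10))  # 0.99
```

For early-stage detection both must be high at once: a screening test applied to a largely healthy population generates far more false alarms per percentage point of lost specificity than one used on Stage Three and Four patients.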

There is no standardised approach to handling data from genome sequencing – quality control differs from one hospital to another. Yet if not managed correctly, data can become corrupted, biased or inaccurate.

In 2011, Swiss-based data medicine company Sophia Genetics set out to ‘develop an algorithmic technology that would make genomic data more accurate for diagnostics’, essentially cleaning and standardising it.

The company developed an analytics platform called Sophia DDM and an AI called SOPHiA to help hospitals overcome the bottleneck of analysing the complex data generated by genome sequencing. A user in the lab loads raw encrypted data into the Sophia DDM platform; the AI transforms the gene data into digital information and rejects anything that could bias it or produce false results – a major problem in diagnosis. It then deciphers the molecular profile of the patient’s cancer and annotates the information.
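SOPHiA’s pipeline is proprietary, but one standard cleaning step of the kind described – rejecting data that could bias a result – is filtering sequencing reads by Phred quality score, where a score Q corresponds to an error probability of 10^(-Q/10). A heavily simplified, hypothetical sketch:

```python
def phred_to_prob(q):
    """Convert a Phred quality score to the probability the base call is wrong."""
    return 10 ** (-q / 10)

def filter_reads(reads, min_mean_quality=30):
    """Keep only reads whose mean Phred quality meets the threshold,
    so low-confidence base calls never reach downstream diagnosis."""
    kept = []
    for sequence, qualities in reads:
        if sum(qualities) / len(qualities) >= min_mean_quality:
            kept.append(sequence)
    return kept

print(phred_to_prob(30))  # Q30 means a ~1-in-1,000 chance the call is wrong

reads = [
    ("ACGTACGT", [38, 40, 37, 39, 40, 38, 39, 40]),  # high confidence, kept
    ("ACGTTTGT", [12, 15, 10, 14, 11, 13, 12, 10]),  # noisy, rejected
]
print(filter_reads(reads))  # ['ACGTACGT']
```

Real pipelines go much further – correcting for the sequencing chemistry, the capture kit and batch effects – which is why SOPHiA’s emphasis on “asking about the data’s production” matters.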

“The only way you can correct the data is by asking about its production. This is the secret of SOPHiA,” says Jurgi Camblong, CEO of Sophia Genetics. “The AI has been exposed to a lot of bias, so it can infer information from noise, recognise that noise and trap it, so the outcome of the data is accurate – it is not going to tell you something is there when it is not,” he says.

The company worked closely with hospitals and was routinely exposed to problems, which it worked to solve with the algorithm. The more problems it overcame, the more the algorithm learned, to the point where the AI no longer needs manual correction to be accurate. The service is charged on a per-patient basis to make it more affordable for clinics.

The more data SOPHiA processes, the more it learns, as information is continuously shared anonymously on the platform to help clinicians advance their expertise and make better decisions. Knowledge sharing is key, says Camblong, to ensure that the expertise of a specialist in London can be accessed to save patients somewhere else.

Cognitive computing can also be used to better decipher and diagnose cancer and diseases from the hundreds of thousands of medical images taken across the world daily, reducing the burden on doctors.

A few years ago, Eyal Gura, co-founder and chairman of Zebra Medical, suffered a scuba-diving accident in Mexico and was cared for in what he describes as a ‘mediocre clinic’. After eight or nine scans, the physician still couldn’t diagnose him because he wasn’t qualified to interpret them.

This gave Gura, who lost his young brother to cancer, an idea to use computer algorithms to quickly and accurately identify diseases in medical scans. He established Zebra Medical Vision, which counts Marc Benioff, CEO of Salesforce, among its investors, and set a target of creating 100 algorithms for 100 diseases.

“Two billion people are joining the middle classes worldwide and will have medical scans, but there are not enough doctors to diagnose those scans. We want to get help from our friends, the computers, to do this,” says Gura.

Through a partnership with the HMO, Israel’s national health service, Gura gained access to 15 million anonymous scans, including diagnosis information and treatment outcome.

Before being used, the data had to be cleaned, curated – separating the malignant from the non-malignant – and combined with biopsy results. Using deep-learning techniques, the company has built 11 of its planned 100 algorithms to identify certain pathologies.
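The curation step Zebra describes – combining scans with biopsy results so each image carries a ground-truth label – amounts to a join on patient identity. A toy sketch, with hypothetical record shapes and file names:

```python
# Hypothetical records: scans keyed by patient id, biopsy results
# providing the ground-truth label needed for supervised training.
scans = {
    "p001": "mammogram_p001.dcm",
    "p002": "mammogram_p002.dcm",
    "p003": "mammogram_p003.dcm",
}
biopsies = {"p001": "malignant", "p003": "benign"}

def build_training_set(scans, biopsies):
    """Pair each scan with its biopsy label; scans without a
    confirmed biopsy are dropped rather than guessed."""
    return [
        (path, biopsies[pid])
        for pid, path in scans.items()
        if pid in biopsies
    ]

print(build_training_set(scans, biopsies))
# [('mammogram_p001.dcm', 'malignant'), ('mammogram_p003.dcm', 'benign')]
```

Dropping unlabelled scans rather than inferring their status is the conservative choice here: a deep-learning model trained on guessed labels would bake those guesses into every future diagnosis.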

The technology is based on advances made over 15 years ago at Stanford University, when scientists taught computers how to detect objects in photos.

Last year Zebra launched an algorithm for breast cancer detection after processing 350,000 anonymous mammogram scans and their biopsy results. It recently reached the milestone of identifying breast cancer at a higher accuracy rate (92 per cent) than a radiologist using computer-aided detection software (82 per cent). The algorithm is now going through regulatory approval in the US and will be released in Europe soon.

Once the company, which was recently listed 20th on the CB Insights ‘AI 100’ list of the 100 most promising global AI start-ups, reaches 100 algorithms, it plans to introduce a radiology assist tool for physicians around the world. Initially, this will improve doctors’ capacity tenfold by enabling them to set aside normal cases and focus on unusual or urgent ones, says Gura. For example, using Zebra’s algorithm, tiny brain bleeds, which may indicate stroke risk, will be highlighted instantly so doctors can prioritise these patients. Prevention is a major advantage of the technology.

Zebra has recently partnered with Dell Services (now NTT Data), which provides medical imaging storage for over 1,000 US hospitals and currently has over 160 million scans stored in its archive. The company will use Zebra’s Analytics Engine as an added-value service on top of its archiving services to help customers identify patients at risk of certain diseases.

“The HMO in Israel ran our osteoporosis algorithm on 196,000 CT scans and discovered that even if they stopped conducting dedicated bone density exams and only used our CT bone density algorithm, they would still identify 50 per cent more people with osteoporosis than they do now,” says Gura.

Many others are exploring machine learning for medical imaging diagnosis. Researchers at Stanford University have created an AI algorithm that can identify skin cancer, while in the UK DeepMind is mining medical records at London’s Moorfields Eye Hospital, analysing digital eye scans to help doctors better understand and diagnose eye disease.

Most AI efforts focus on standard diagnosis and finding better treatments, because those data sets are easier to obtain and quantify, but companies are also working on algorithms for early detection and prevention, aiming to make healthcare preventive rather than reactive.

Companies are only beginning to scratch the surface. As knowledge evolves and is shared securely on platforms like SOPHiA, more actionable things can happen, such as identifying new sub-types of cancer, or scanning the population more quickly for hereditary and preventable diseases.

Guardant Health’s Eltoukhy believes that cancer will eventually be managed in the same way as HIV – ongoing and personalised, adapting treatment as the disease evolves.

“The big challenge with cancer is it is a moving target,” he says. “It almost shouldn’t be called cancer in the singular, it should be called cancers.”

Guardant is working with a number of academic and healthcare institutions on Project Lunar, which aims to develop a highly sensitive and specific test to identify multiple types of cancer at an early stage by detecting mutated DNA fragments in blood samples from high-risk individuals. Thousands of patients will be enrolled in trials to demonstrate first the feasibility and then the efficacy of early detection.

There are barriers to using AI in healthcare: working with such large data sets can be extremely tedious, and earning the trust to handle data securely is a challenge that requires the highest standards to be upheld. Moreover, the technology is evolving so fast that regulators struggle to keep up. Yet there is little doubt that data coupled with computers will transform healthcare.

“By 2025, AI systems could be involved in everything from population health management to digital avatars capable of answering specific patient queries,” says Harpreet Singh Buttar, analyst at Frost & Sullivan.

Genome sequencing has provided an inroad to understanding cancer. Data, machine learning and AI will give doctors the tools to make better use of this information and save some of the 8.2 million lives cancer claims every year.