Raz Eliav

Demons in the Data Smoke

When the horrifying pictures from the Twin Towers started flooding in, I was astonished by the sheer amount of nonsense that came along with them.


The internet was much younger, and we were all much more naïve.


Fake news was actually forwarded by e-mail back then, and I recall one item that caught my attention more than any other.


It was “concrete evidence” that demonic forces were behind the attack.

In the billows of smoke and dust that rose from the ruins, people overlaid the outline of an evil-looking face that was apparent in some of the photos. Once you see it, you cannot unsee it.


It took several weeks for the more elaborate demons to be invented, and these conspiracy theories reverberate to this day.


Our ability (and desire) to quickly detect patterns in complex noise is surely what kept us safe in the jungle, but in the absence of immediate danger, our minds can connect dots in ways that are neither true nor helpful, and get us stuck in the wrong direction.


This is especially important in drug development, where we oftentimes see faces in the smoke and need to decide whether it is a real signal with a mechanistic explanation, or just noise.


Take a chromatogram for example.


It tells many different stories about our product’s composition, but it is not always possible to tell what that “shoulder” adjacent to a peak really is. Is it an impurity? Is it an isoform? Is it an artifact? And what happens when you overlay 10 such chromatograms, from different samples in your process?
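
To make that overlay exercise a little more concrete, here is a toy sketch in Python. The peak positions, noise level and thresholds are invented for illustration only; the idea is simply that a smoothed second derivative turns a “shoulder” into a countable feature, even when it never forms a peak of its own.

```python
# Toy sketch: overlay several traces and look for extra negative-curvature
# lobes in a smoothed second derivative. Synthetic peaks, noise level and
# thresholds are illustrative assumptions, not a validated analytical method.
import numpy as np
from scipy.signal import find_peaks, savgol_filter

rng = np.random.default_rng(0)
time = np.linspace(0.0, 10.0, 2000)   # retention time in minutes
dt = time[1] - time[0]

def synthetic_trace(shoulder_height):
    """Main peak at 5.0 min plus a partially resolved shoulder near 5.45 min."""
    main = 1.00 * np.exp(-((time - 5.00) ** 2) / (2 * 0.15 ** 2))
    shoulder = shoulder_height * np.exp(-((time - 5.45) ** 2) / (2 * 0.12 ** 2))
    return main + shoulder + rng.normal(0.0, 0.003, time.size)

# Ten "samples from the process", each with a slightly different shoulder.
traces = [synthetic_trace(h) for h in rng.uniform(0.15, 0.35, size=10)]

for i, trace in enumerate(traces):
    # Smoothed second derivative: every real component (apex or shoulder)
    # shows up as a negative-curvature lobe, even without a local maximum.
    d2 = savgol_filter(trace, window_length=31, polyorder=3, deriv=2, delta=dt)
    lobes, _ = find_peaks(-d2, prominence=4.0)   # threshold in AU/min^2, assumed
    print(f"trace {i}: curvature lobes at {[round(time[j], 2) for j in lobes]} min")
```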


So we run additional experiments, develop the method further and try to get some hints from other tests, but what we oftentimes get is just more smoke.

Eventually, we must assign a cohesive story to the data and accept the limits of the method (and of our own understanding).


In that sense, the (current) narrowness of AI’s intelligence is a blessing.


The reduction of complex reality to a “most likely” answer can help us humans challenge what we think we see against what is actually there.


Combined with machines’ ability to digest far larger datasets than we can and overlay them one on top of the other, AI has the power to send us in the right direction in our exploration, even if it remains for us to decide the meaning behind it, going back to first principles.
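
As a flavor of what that overlaying could look like in practice, here is a minimal, unsupervised sketch: stack repeated noisy traces, keep only the principal component(s) that carry structure shared across samples, and discard the rest as trace-specific noise. The synthetic peaks, noise level and number of components kept are assumptions made purely for illustration, not a recommendation for any real dataset.

```python
# Minimal sketch of letting the data define the filter: keep the shared
# structure across repeated traces, drop the rest as noise. All numbers here
# are illustrative.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 1000)

# Twenty synthetic samples: same main peak, a shoulder whose height varies
# from sample to sample, and baseline noise on top.
main = np.exp(-((x - 5.00) ** 2) / (2 * 0.15 ** 2))
shoulder = np.exp(-((x - 5.45) ** 2) / (2 * 0.12 ** 2))
heights = rng.uniform(0.1, 0.4, size=20)
clean = main + heights[:, None] * shoulder            # shape (20, 1000)
stack = clean + rng.normal(0.0, 0.02, size=clean.shape)

# Rank-1 reconstruction via SVD: the only real trace-to-trace variation here
# is the shoulder height, so one component captures it; the rest is noise.
centered = stack - stack.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
k = 1
denoised = stack.mean(axis=0) + (U[:, :k] * s[:k]) @ Vt[:k]

print("raw residual RMS:     ", round(float(np.sqrt(np.mean((stack - clean) ** 2))), 4))
print("denoised residual RMS:", round(float(np.sqrt(np.mean((denoised - clean) ** 2))), 4))
```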


Long before we reap the benefits of AI in the actual regulated GxP realms, we can use AI as a noise removal filter for our complex datasets, and I am eager to see what the industry comes up with.
