These are the first 19 lines of the famous "Decree of Canopus", as found on the stela at San Al-Hajar (the site of ancient Tanis, the biblical Zoan) in 1866, by Prof. Lepsius, Reinisch, Roesler and Weidenbach. A detailed translation of the Egyptian hieroglyphic writing was possible because the stela also carries a Greek inscription. There is explicit meaning here - even if you might not be able to translate it initially. You can tell just by looking at it that it is a written language.
The trick with Neural Networks and their use in Artificial Intelligence applications is to extract meaning from what may, at first glance, look an awful lot like random noise or a low-information-content image. The critical data item, the one that matters most, is often obfuscated or disguised. Its importance only becomes evident later, as the system evolves through time. But seeing it early, and/or clearly enough that the meaning can be extracted, can make the difference between success and failure - even between life and death.
If there is something there, then with enough careful assessment we can usually decode it. But if the data really is almost all random noise, then the decoding exercise will not just be difficult - it will probably fail. Knowing that is also of real value.
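One rough, hand-rolled way to make "is there anything there?" concrete is to estimate the Shannon entropy of the raw bytes: written language and structured images score well below the 8-bits-per-byte maximum, while something close to pure noise sits right at it. A minimal Python sketch (the sample data is invented purely for illustration):

```python
# Rough information-content check: Shannon entropy of a byte stream.
# Structured data (text, images of writing) scores well below 8 bits/byte;
# near-random noise scores close to the 8-bit maximum.
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)          # frequency of each byte value
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

if __name__ == "__main__":
    structured = b"the quick brown fox jumps over the lazy dog " * 100
    noise = os.urandom(4096)        # effectively random bytes
    print(f"structured text: {byte_entropy(structured):.2f} bits/byte")
    print(f"random noise:    {byte_entropy(noise):.2f} bits/byte")
```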
Neural Network technology ("Deep Learning" computer programs, as the MSM calls it) has been successful at reading CT scans and detecting tumors. Here is what a CT scanner looks like. A large wheel carrying an x-ray emitter and detector rotates and takes a series of photographic slices, which can be assembled into a 3-D picture of the inside of a live human body, or part thereof. A radio-opaque dye, made from a compound of iodine, is injected into the person, so the image shows up *much* more clearly than it would without the dye. Your kidneys filter out most of the dye in less than an hour. As the dye is injected, your hands and feet may feel warm, but the feeling passes quickly.
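To make the "assembled into a 3-D picture" step concrete, here is a minimal NumPy sketch of stacking 2-D slices into a volume. The slice count, dimensions and spacing are made-up values; a real pipeline would read the slices from the scanner's DICOM series rather than faking them:

```python
# Minimal sketch: assembling 2-D CT slices into a 3-D volume with NumPy.
# Shapes and slice spacing here are invented for illustration only.
import numpy as np

num_slices, height, width = 120, 512, 512   # a typical-looking slice stack
slice_thickness_mm = 1.25                   # hypothetical spacing between slices

# In practice each slice would come from the scanner; here we just fake
# them as arrays of attenuation values.
slices = [np.zeros((height, width), dtype=np.int16) for _ in range(num_slices)]

# Stack along a new first axis to get a (depth, height, width) volume.
volume = np.stack(slices, axis=0)
print(volume.shape)                         # (120, 512, 512)

# Any plane can then be resliced out of the volume, e.g. a coronal cut:
coronal = volume[:, height // 2, :]         # shape (120, 512)
```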
This is a big deal. Neural network technology (called "deep learning" in the popular press) has been shown to be very good at image processing. The initial experimental results of using it on tomographic scans of lungs to detect cancerous tumors have been so successful that the occupation of "radiologist" is now being re-thought. The big CT scanner will soon not just take the tomographic picture-set; it will also tap into a database of *all* image-sets and indicate whether a tumor is present. Last week I had a CT scan, and I (and my doctor too, I trust) am awaiting the results of the image evaluation by the hospital radiologists. CT scanning technology has become a topic of real interest for me now.
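For a sense of what "indicate whether a tumor is present" looks like in code, here is a minimal PyTorch sketch of a small convolutional classifier that labels a single CT slice as tumor / no-tumor. The architecture and the two-class setup are my own assumptions for illustration, not any particular published or clinical system, and a real system would be trained on a very large set of labelled scans:

```python
# Minimal sketch (assumed architecture, not any specific published system):
# a small convolutional network that labels one CT slice as
# "no tumor" (0) or "tumor" (1). Shows the shape of the computation only.
import torch
import torch.nn as nn

class SliceClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 2),                 # two classes: no tumor / tumor
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SliceClassifier()
fake_slice = torch.randn(1, 1, 512, 512)      # one grayscale 512x512 CT slice
logits = model(fake_slice)
print(logits.softmax(dim=1))                  # class probabilities (untrained)
```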
A stunning example of what is now possible with advanced, high-resolution CT scanners. The image produced on the right, courtesy of the University Medical Center Göttingen, Germany, shows an example of what is termed "cinematic" rendering of a tomographic image. Compared to the 1974-vintage image, this is a real leap forward. What is significant is how neural-network technology is being used to process CT scans and determine whether tumors are present. This is perhaps one of the most promising immediate uses of what are termed "deep learning" neural network techniques. If a tumor is present, then by definition the image data that describes the tumor is also likely to be present, given sufficient scanner resolution.
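Cinematic rendering itself involves sophisticated physically-based lighting that is well beyond a blog snippet, but the basic step of turning a tomographic volume back into a viewable 2-D picture can be sketched with a much simpler maximum-intensity projection; the volume below is synthetic, purely for illustration:

```python
# Minimal sketch: projecting a 3-D CT volume down to a 2-D picture.
# This is a maximum-intensity projection, the simplest possible rendering;
# nothing like "cinematic" rendering, but it shows the basic step of
# turning the volume into a viewable image. The volume here is synthetic.
import numpy as np

volume = np.random.rand(120, 512, 512).astype(np.float32)  # fake (depth, H, W) scan

# Collapse the depth axis by keeping the brightest voxel along each ray.
mip = volume.max(axis=0)                                    # shape (512, 512)

# Normalize to 0-255 so it could be saved or displayed as a grayscale image.
image = (255 * (mip - mip.min()) / (np.ptp(mip) + 1e-8)).astype(np.uint8)
print(image.shape, image.dtype)
```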