Robert Goddard's 1920's rocket-launch pad, with an inset of my little Android APLSE app, estimating a probability density on a time-series, using a Samsung Tab-A, running under Android 6.0.1 (Marshmallow).
If you compare Goddard's 1920's pad to a late 1990's space shuttle launch facility, you can get an idea of how GEMESYS Ltd compares to DeepMind, a division now of the Google/Alphabet complex. DeepMind folks have just open-sourced their Sonnet tools, which sit on top of TensorFlow. I read thru the DeepMind stuff, with the unpleasant sense of "Oh crap, we are screwed..." But then I read further, and maybe we might still have a chance... It was certainly nice of the DeepMind folks to release their Sonnet code to open-source.
The little Samsung Tab-A, running Android 6.0.1 and APLSE with one of my workspaces, is my first cut of a prototype Augmenter. It's really primitive, but it works, and helps with getting the business done. I have this analytic stuff running on the iPad now, also.

Field Notes from Lorcalon Farm

APL on iPad & TensorFlow, Xerion & the Helper-AI's

GEMESYS Ltd. is the name of my consulting practice.  We do research and analysis in science and technology, with a view to learning, teaching, and helping.  And we look for special economic situations that work.   GEMESYS Ltd was established in 1981, and continues to offer research and consulting services to address unique requirements.  We operate from Lorcalon Farm, in Canada.  (The image at right was made using the Laplace partial-differential-equation simulation example from Google's TensorFlow tutorials. )

Why Do Datascience? & Why Use AI?

Since the 1990's, I've done data-science-related work under the radar, as it were.  I've even built amplifiers and radios to learn about feedback processes.  (Building and tuning an actual, physical device teaches one so much.  The math of it gets into your fingertips...)   I read George Soros's stuff on "reflexivity" in the markets (circa 1980's), and I think I am beginning to understand why "technical analysis" actually works.  We used to think it was because it captured the behavioural economic features of humans (cf. Amos Tversky, Daniel Kahneman, Richard Thaler et al), but now I think there is more there.  If you need to make money using the markets (ie. to pay your bills), you either go broke, or you end up using some form of technical analysis (or, you become a portfolio manager, take a percentage of the assets, and you don't care what happens, as long as you can keep your clients).  But now, there is hard-core datascience, which lets many different ideas be looked at all the time.  Having a good AI helper, with statistically significant results associated with its predictions, can, I suspect, give one an edge, even if much of the data one encounters is mostly wild randomness.   As a lone-wolf in private practice, you either have a verified edge, or you are quickly carried out, and fall into the abyss.  And it seems AI can give you an edge.  [Mar. 31, 2017.  Well, I guess it's confirmed:  US-based BlackRock, one of the biggest investment funds on the planet now, with $5.1 trillion in assets, has announced that it will sack a bunch of its human stock-pickers, and replace them with *robots* - the term Wall Street uses for AI-driven investment strategies.  Source: Wall Street Journal article, Mar. 28, 2017.]

The image shown further down this page is a chart showing a stock price, as of Feb. 1st, 2017.  The indicated price trajectory was so extreme that I thought the model was just broken, and as I was out already, I did not get back in.  But my pure technical model said, yes, go long. (Full disclosure: I still didn't...)   By Feb. 21, the price was up, and was tracking the forecast price pathway.   With a complex model, you don't know what it has captured - but it appears that perhaps it did capture something in the data, given the extreme curve divergence.   So, I realize, I must investigate further...  [Update:  As time goes by, I just keep getting more evidence of how any *model* is going to be successfully gamed by the market.  You don't want a model, you want an old, experienced guy to offer some gentle advice.  Since there is no such guy - a *very* well-trained AI might be the next best thing, perhaps?]  [Even more disclosure:  Went long, the day before the CEO announced a $4.9 acquisition in the USA, which caused the stock to downtick from 117 to 114.  <Sigh>.  But the acquisition is probably a good idea, all things considered... and I use no margin; I just want to drink from the dividend stream.]

Status Log (TensorFlow/Xerion work):

[Apr. 09, 2017] - Google's "DeepMind Technologies" group in London have just open-sourced their "Sonnet" product.  This might be a big deal.  Sonnet sits on top of TensorFlow, and lets it be used to create complex neural networks more easily.  I am interested in trying it.   I've had a large tooth removed, have more dental work scheduled next week, and have to do a lot of tax work to file personal and corporate income tax and HST forms for farm and firm. Dealing now with pain...  [Update:]  Just went thru the *DeepMind* website, did a quick scan of their Github stuff, & read their paper on "Learning to Learn".  Imagine if US rocket-pioneer Robert Goddard was transported to a 1990's launch of the space shuttle - that's how I feel after this quick scan. These guys look like they own the AI field, especially since they have the resources of Google behind them now.  They look to have infinite power, both in CPU cycles and cash!  Oh my...

My only chance here is that these guys like chess.  I *hate* chess with a passion - as I detest most closed-environment gamey stuff.  Game-playing is time wasted.  All the interesting stuff, and the stuff that matters - that makes a difference to the future, and drives humanity forward - lies in the open, *unbounded* realm of the pure real - the place where neural networks typically collapse and fail badly.   But you can use NN technology to *augment* human intelligence - like lenses can help your eyes see better, amplifiers can let you hear better, and computers can let you organize and process information better. (And yes, like an M1911 .45 can be used to punch a hole thru your adversary better than your fist can - let's be honest.)  In a formal system that is tightly bounded by rulesets, where the distributions are known, a well-built AI will *always* win.  What about open scenarios, where there are no formal rules, and the rate and intensity of change itself is also dynamic?  Can an AI help?   I am pretty sure it can.  And I think I know what it has to be able to do.  The AI does not replace or overwrite the human agent; it augments his ability, and lets him make better decisions, quicker, and with fewer of the errors that behavioural economics shows us *really* do occur.   I'm not in this for the money.  I want to prove a point, more than anything, and build a device.  We need AI technology like soldiers need rifles.  This technology could aid us all by letting us make fewer mistakes, and avoid the "Wee-Too-Low! / Bang-Ding-Ow!" outcomes that are becoming increasingly common in our modern world.  Perhaps I still have a chance... (I put a picture of my primitive Analyzer tool output, essentially a first cut of the Augmenter I envision, running on a Samsung Tab-A, under Android 6.0.1, at screen bottom.
It shows the M-wave Analyzer output, calculated and displayed on the Samsung tablet, and an estimated probability density, which suggests trade size for a given risk level.  It essentially suggests how big you should bet, given the risk level you want to accept, and shows it all as a picture, so you can see exactly what you are dealing with, given the data-range you believe is appropriate for the current picture-of-the-world your necktop neural network tells you is now in play.  You can see where I am going with this, yes?)
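To make the trade-sizing idea concrete, here is a minimal sketch of the *kind* of calculation involved - look at the empirical distribution of recent returns, pick a risk quantile, and size the position so a move at that quantile only costs a chosen fraction of capital.  The function name, parameters and data below are all invented for illustration; this is emphatically *not* the M-wave Analyzer's actual method.

```python
import numpy as np

# Sketch of a quantile-based trade-sizing calculation (names and numbers
# invented -- NOT the M-wave Analyzer's actual method).
def suggest_trade_size(returns, capital, risk_frac=0.01, quantile=0.05):
    """Size a position so that a move at the chosen quantile of the
    empirical return distribution loses only about risk_frac of capital."""
    r = np.asarray(returns, dtype=float)
    worst = np.quantile(r, quantile)       # e.g. the 5th-percentile daily move
    if worst >= 0:                         # no observed downside in the data
        return capital
    size = (risk_frac * capital) / abs(worst)
    return min(size, capital)              # never size beyond capital

rng = np.random.default_rng(42)
rets = rng.normal(0.0005, 0.01, 500)       # hypothetical daily returns
print(round(suggest_trade_size(rets, 100_000.0), 2))
```

The tablet app draws the estimated density as a picture; the point of a calculation like this is just to turn the chosen point on that picture into a dollar figure.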

[Mar. 31, 2017] - Got Xerion running with original late 1990's data (Dmark segmented time-series network).  Ran with many different types of training - confirmed it all works.  Xerion looks to be a predecessor product to TensorFlow in many ways.  Using simple steepest descent (standard backpropagation), with a fixed step and an epsilon of 0.1, it can take about 90,000 iterations to train down to the noise in a segmented time-series.  But use a line-search, and conjugate gradient with restarts, and you can get to the same level of training (essentially, just overfitting a time-series to check the limiting case of the training algorithm), and Xerion will fit the curve in about 300 to 400 iterations.  It's a pretty dramatic difference.  My original approach was quite wrong (using a single time series, segmented into cross-sectional training cases).  I have a new idea, based on current practitioner methodologies, that looks to be much better.   I'm having arguments with a PhD type, who thinks NN tech is useless for market phenomena (he is a "random walk" believer, it seems), but given the modern state of the NN art, I am pretty sure my new approach can be useful.   I note with interest that Dr. G. Hinton (Xerion & TensorFlow AI academic guru), and Edmund Clark (former CEO of TD-Bank in Canada), will be setting up a new gov't-funded "Artificial Intelligence" institute in Ontario, based in Toronto.   Two new charts at page bottom - a Ghostscript image of the original Xerion-driven DMark series (raw price data scaled to fit between 0 and 1), training target versus network output, and today's Cdn dollar chart - showing the complete NON-RANDOMNESS of the modern markets.    Markets are not random, they are chaotic.  The "random walk" picture of the world, where you believe in stable distributions, and build models that use distribution-tails to estimate your risk, is wrong.  It has already given us the 2007-2008 financial meltdown.
Today, the Cdn-dollar chart looks like the output from a square-wave generator.  It's not random.  It is just one example of many that you can see *every day* in the markets.
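The fixed-step versus line-search gap is easy to reproduce on a toy problem.  The sketch below fits a tiny least-squares model two ways in plain NumPy - a small fixed step, then a crude backtracking line-search.  Everything here (data, step sizes, tolerances) is invented for illustration, and it is not Xerion's code, but the iteration counts show the same qualitative effect:

```python
import numpy as np

# Toy reproduction of the training-speed gap: fit a 2-weight least-squares
# model, once with a small fixed step, once with a crude backtracking
# line-search.  All numbers here are invented; this is not Xerion's code.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w

def loss(w):
    return np.mean((X @ w - y) ** 2)

def grad(w):
    return 2.0 * (X.T @ (X @ w - y)) / len(y)

def train(use_line_search, eps=0.02, tol=1e-6, max_iter=200_000):
    w = np.zeros(2)
    for i in range(1, max_iter + 1):
        g = grad(w)
        if np.linalg.norm(g) < tol:
            return w, i                     # converged: gradient ~ zero
        step = eps
        if use_line_search:
            step = 1.0
            for _ in range(40):             # backtrack until the loss drops
                if loss(w - step * g) < loss(w):
                    break
                step *= 0.5
        w = w - step * g
    return w, max_iter

w_fixed, n_fixed = train(use_line_search=False)
w_ls, n_ls = train(use_line_search=True)
print("fixed-step iterations:", n_fixed, " line-search iterations:", n_ls)
```

A real conjugate-gradient-with-restarts method, as in Xerion, does better still; even this crude line-search shows why adapting the step size matters so much.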

I've been stepping thru backpropagation by hand, using basic partial differentiation calculus, and the chain-rule, just so I can clearly understand the original idea.  I learned some C++ also.  Downloaded Alan Wolfe's NN sample code, only to find it won't build on my Linux CentOS boxes, with gcc 4.4, because of some new loop-construct recently slotted into Clang or LLVM or whatever the heck the kids are now using - something from C++11 or 17, or Apple's lab.  More reading to do. This project is taking on a life of its own.
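Here is the sort of by-hand exercise I mean: one sigmoid neuron, squared-error loss, the gradient derived with the chain rule, then checked against a numerical derivative (the input, target and weights are toy numbers, chosen arbitrarily):

```python
import numpy as np

# One sigmoid neuron with squared-error loss:  L = (sigmoid(w*x + b) - t)^2
# Chain rule, by hand:  dL/dw = 2*(a - t) * a*(1 - a) * x,  a = sigmoid(w*x + b)
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, t = 1.5, 1.0          # one training case: input and target (toy numbers)
w, b = 0.3, -0.2         # arbitrary starting weights

a = sigmoid(w * x + b)
dL_dw = 2.0 * (a - t) * a * (1.0 - a) * x      # the hand-derived gradient

# Sanity check: central-difference numerical derivative
h = 1e-6
numeric = ((sigmoid((w + h) * x + b) - t) ** 2
           - (sigmoid((w - h) * x + b) - t) ** 2) / (2.0 * h)
print(dL_dw, numeric)    # the two values should closely agree
```

Stacking this chain-rule step layer by layer, from the output back to the inputs, is all backpropagation is.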

[Mar. 24, 2017] -  Completed prototype of neural network definition and activation routines in APL on iPad.  Great having a working spec - trivial Xor2 net - can train it on Xerion, and activate/execute the net on iPad using APL (which is great for matrix stuff).  See page bottom for picture.  Numbers match, Xerion in Linux, iPad using APL, for trivial toy case of Xor2 network.
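For anyone wanting to replicate the idea without Xerion or APL, here is a NumPy sketch of activating a 2-2-1 Xor2 net as plain matrix operations.  Note the weights below are a well-known hand-picked set that solves XOR - *not* my Xerion-trained values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked weights known to solve XOR (NOT my Xerion-trained values).
W1 = np.array([[ 20.0,  20.0],     # hidden unit 1: fires if either input is on
               [-20.0, -20.0]])    # hidden unit 2: fires unless both inputs are on
b1 = np.array([-10.0, 30.0])
W2 = np.array([20.0, 20.0])        # output: fires only if both hidden units fire
b2 = -30.0

def activate(x):                   # one matrix operation per layer, APL-style
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, round(float(activate(np.array(x, dtype=float)))))
```

The whole activation pass is two matrix products and two element-wise sigmoids - which is exactly why APL, with its native matrix operations, handles this so naturally.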

[Mar. 17-20, 2017] -  Working on the "cross entropy" idea, which drives how artificial neural-networks are trained.  The idea is that the initial (actual) probability distribution is mapped, by the artificial neurons in the network, out to a posterior target distribution - and that there are different entropy characteristics across the various possible target distributions.  One seeks to minimize the "Kullback-Leibler divergence", or the entropy difference between the initial and the posterior distributions.  This sounds quite complex, but if you are using "one-hot" encoding (for example, trying to identify written digits), and your initial distribution is simply "0 0 0 1 0 0 0 0 0 0" - ie. your number is a "3" - then the cross-entropy summation of the initial probability values times the logarithms of the generated distribution values boils down to taking a single natural logarithm of the sigmoid or softmax value (ie. the probability-like number between 0 and 1) that the network generated for the correct class.    You can use a gradient descent search to drive your back-propagation, but the "stopping point" of the network training will be when all the cross-entropy values between the initial and posterior probability distributions are as small as possible.    It should be possible to make your network "recognize" with a high level of accuracy.  This recognition can extend to more than just written digits.   One should be able to create an artificial "Helper", that has superior recognition ability, for whatever you train it for, given you can "show" it enough accurate raw data - what we used to call "training cases".   I suspect "Helper AI" technology might become a must-have tool as we move into this brave new world.  (I really wanted to get a TensorFlow AI running on my iPad.  My vision for this was Isaac Asimov's "Foundation" series - where Hari Seldon had his "probability calculator" in the first chapter, set on Trantor.
I can't get Numpy to load thru to Python yet on the iPad, but it looks like Xerion might work...)  I am thinking of asking a Japanese company to design a special Hyper-tablet device for me - but running *pure* Linux, no Android or iOS stuff in the way...
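The one-hot collapse is easy to verify numerically.  A small sketch, with made-up logits for the ten digit classes:

```python
import numpy as np

# Made-up network outputs (logits) for the ten digit classes:
logits = np.array([1.2, 0.4, -0.8, 3.1, 0.0, -1.5, 0.7, 0.2, -0.3, 0.9])
target = np.zeros(10)
target[3] = 1.0                            # one-hot: the digit is a "3"

q = np.exp(logits) / np.exp(logits).sum()  # softmax -> probability-like values

# Full cross-entropy sum:  H(p, q) = -sum_i p_i * log(q_i)
full = -np.sum(target * np.log(q))

# With a one-hot p, every term but one is zero, so the sum collapses to:
shortcut = -np.log(q[3])

print(full, shortcut)                      # the two values are identical
```

Driving that single -log(q) term toward zero is the same as driving the network's probability for the correct class toward 1 - which is why the one-hot case is so convenient.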

[Mar. 14-15, 2017] - Fell down a big rabbit hole. Decided to look at my old Xerion stuff, got obsessive about it, and decided to convert the 20-year-old Uts/Xerion to run on a modern Linux box.  Xerion was the U of Toronto product built by Drew Van Camp and others, offered by Dr. Hinton's group to Canadian industry, as it was funded by a gov't grant process.  I took it and ran, and built a Linux box using Slackware Linux just to get Xerion running, and build some neural-nets to investigate time-series data.   As I dug deeper into TensorFlow/Python, I realized it looked a lot like UTS-Xerion/Tcl/Tk+itcl+Tclx - which I know well.   Learning is all about jumping from one level to another.  Getting Xerion running on a modern Linux has been a bit of work.  (Just getting a clean compile of the code using a modern gcc was non-trivial.)  But I can run the original Xor2 example and it all seems to work well.   Having Xerion running will be very useful, as I can verify TensorFlow builds against original Xerion efforts.  Xerion is not convolutional, but it did offer a number of alternatives to basic gradient descent, which - in the example of training a boolean net like the Xor2 example - can be shown to be useful.  It's also a good learning tool, with nice visualization.  (Screen shot of Uts/Xerion is below.)  (Mar. 15:  Fixed a bug - the Network Node Unit & Link Display was not working; fixed.  Built Xerhack, a visualizer toolkit that uses the Tk Canvas.)

[Mar. 8, 2017] - Got the image-hacking stuff working in Python on both Mac OSX and Windows.  Took the Ripples-in-Pond TensorFlow example, and made it look more like exploding stars in a dark star-field.  Runs *without* IPython, Jupyter and Python Notebooks (displays 5 images in sequence as .jpg files; uses SciPy and the Pillow version of PIL (the famous Python Image Library)).   Images are interesting - like a star-field evolving over giga-years (see picture above.)   Here is part of the code:  (Click "Code" in top menubar for the rest of it...)

    # --- The TensorFlow Laplace image example (uses PIL, and scipy.misc)
    # --- Modified: Mar 7, 2017 - by MCL, to just use image file display
    # ---                                       instead of Python Notebooks, IPython, etc.,
    # ---                                       with negative damping and darker image backgrd.
    # ---                                       (Instead of ripples in a pond, we have
    # ---                                       exploding stars ... )
    # --- Produces the initial image, 3 intermediate images, and the final image
    #     as .jpg files. Requires only: tensorflow, numpy, scipy and Pillow
    #     and Python 2.7.10.
    # --- This example taken from Tensorflow Site:
    # ---                           
    # --- and provides a nifty example of manipulating n-dimensional tensors.
    # ---
    # --- For Python newbies (me!):   1) invoke Python in terminal shell
    # ---                             2) >>> execfile("")
    # --- focus on understanding exactly how Tensorflow is reshaping tensors
    # ------------------------------------------------------------------------------------------
    # --- Import libraries for simulation
    import tensorflow as tf
    import numpy as np

    import scipy.misc

 <<< The rest of the code is in the "Code" section. Just click on "Code" on top menubar >>>



[Mar. 1, 2017 ] - As mentioned previously, I have TensorFlow + Numpy running on Python on the MacBook OSX now, and have got TensorBoard to show node names finally. This is the first trivial y = m * x + b (linear regression) program one can run, using the gradient descent method to do the least-squares regression line. I've updated the two pics showing TensorBoard's display of a process graph for linear regression (now with variable names!), and the Python+TensorFlow code example.  I've also posted these to the GEMESYS Facebook site.  Next, I want to 1) create a very simple neural network, and 2) read a real data file of training cases, and produce some real output to a file. There is a lot of useful information on StackOverflow and various websites built by clever folks.  I've learned a bit just reading the StackOverflow queries.  I was sold on the NN methodology in the 1990's.  Xerion used Tcl/Tk to provide visualizations, which I used to develop in (and still use!), but I typically ran my networks in background mode, and used GNUplot and APL to chart the prediction curves.  I have these old C programs I used to chop up data series, and I am itching to drop some of the old training files into a modern TensorFlow net.
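For reference, the same least-squares-by-gradient-descent math can be written in a few lines of plain NumPy.  The x/y training values below are the ones from the TensorFlow "Getting Started" tutorial; the learning rate and iteration count are my own arbitrary choices:

```python
import numpy as np

# Fit y = m*x + b by gradient descent on the mean squared error -- the same
# math as the TensorFlow "Getting Started" example, in plain NumPy.
# (Learning rate and iteration count here are arbitrary choices of mine.)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, -1.0, -2.0, -3.0])      # the tutorial's training data

m, b = 0.0, 0.0
lr = 0.01
for _ in range(20_000):
    err = m * x + b - y
    m -= lr * 2.0 * np.mean(err * x)       # d(loss)/dm
    b -= lr * 2.0 * np.mean(err)           # d(loss)/db

print(round(m, 3), round(b, 3))            # converges to m = -1, b = 1
```

TensorFlow does exactly this, except the gradients are derived automatically from the computation graph rather than written out by hand.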

[Feb. 24, 2017]  - TensorFlow is a bit more involved than Xerion, Prof. Hinton's U of Toronto product from many years back.  Here is my first hack, getting the basic tutorial running, with a trivial linear regression, and viewing the graph in TensorBoard, which one does using a browser session to localhost, port 6006.  To get the graphic working, you slot in the statement "writer = tf.summary.FileWriter('/Path/to/logfiles', sess.graph)" before you run your training.  This writes the event-log data for the model structure to the TensorBoard log-file directory, and allows the visual image of your model to be generated.  Very, very cool.  I put two images at *very* bottom of page, one showing the program text for my modified version of the TensorFlow "Getting Started" tutorial with the simple linear regression model y = m * x + b, and the generated TensorBoard model structure image, which is viewed using the Firefox browser on the Macbook.

[Feb. 21, 2017]  - Ok, got it. Finally got TensorFlow installed and working. Gave up on the Linux box, as it runs some production stuff on news-articles that I need.  Used the Apple MacBook Pro with Yosemite (OS X 10.10.5), which had Python 2.7.10.  Was a complex project, but got it running.  Apple had Python 2.6 running by default, and I had installed Python 2.7 with "numpy" (the scientific numeric package for Python - it's just the old Fortran math libraries, which I used to use at Treasury for bond-math calcs and econ-research).  Had to get the Python "pip" program working, and the first install of TensorFlow with pip smashed everything, due to a flaw in the pyparser stuff.  Had to manually fix a Python program called "" in the /System/Library/Frameworks/... directory tree, as well as disable the original "Frameworks"-located "numpy" and "six" modules.  This was critical.  The TensorFlow Python-pip install caused pip, easy_install, and the lot, to fail badly.  And the Frameworks directory tree Python modules (some Apple standard?) caused Python to always load the old Numpy 1.6 and Six 1.4 versions - and TensorFlow needs Numpy 1.12 and Six version 1.10 or higher.   Until I fixed the "" parser stuff, and disabled the Apple-located default numpy and six, TensorFlow complained about wrong versions. What is silly, is that pip (the Python install program) drops the updated modules in another directory, and until the ones earlier up the path are removed (eg. renamed from numpy to numpy_old), Python keeps loading the old ones, even after one has run pip and/or easy_install to load in the new ones.  I put a note on StackOverflow and posted the bug and the fix on Github/Tensorflow - search for Gemesys.  Bottom-line is, I was able to run the baseline TensorFlow tutorial, and make it print 'Hello TensorFlow!'
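A diagnostic that would have saved me hours: ask Python directly where it will load a module from.  Python walks sys.path in order, so a stale copy earlier on the path silently shadows a freshly pip-installed one later on the path.  (The sketch uses the stdlib "json" module so it runs anywhere; substitute "numpy" or "six" to debug the situation above.)

```python
import importlib.util
import sys

# Which copy of a module will "import" actually load?  Python walks
# sys.path in order, so a stale copy earlier on the path silently
# shadows a freshly pip-installed one later on the path.
def which_module(name):
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(sys.path[0])                 # the first place Python looks
print(which_module("json"))        # stdlib example; try "numpy" or "six"
```

If the printed path points into the Frameworks tree rather than site-packages, you have found your stale module.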

[Feb. 19, 2017] - I hate Linux dependency issues. Tensorflow requires glibc 2.14 and my CentOS 6.6 box has glibc 2.12, etc. etc...  TensorFlow wants Python 2.7 (or 3.5), but CentOS 6.6 is default Python 2.6.6, which "yum" needs to work, so I have to try virtualenv, or whatthef*ckever.   I've tried several tricks to get TensorFlow running, but no luck even on the Linux box.     I had hoped to put some datascience stuff on the iPad.  I have APL running, and GNUplot can do non-linear regression, but I was hoping to make a neural-net that could be trained on a GPU-Nvidia type Tensorflow box, and then just run on the iPad.  So far, no go.

[Jan. 27, 2017 - Started working with TensorFlow, trying to do some gradient descents across a custom phase-space.   I attended Geoffrey Hinton's Xerion lectures at UofT back in the 1990's, and I built some neural nets using Xerion NNS to backtest commodity markets.  They worked, actually, and I had planned to publish a page on Xerion and TensorFlow...  but I got very ill - some kind of flu thing which involved a 'cytokine storm'.   I'm recovered now, but it was touch and go.  Wanted to publish a page with a running Xerion net (or TensorFlow example) being back-propagated, on the iPad.  Apple is a serious monopoly, and AI is real and perhaps dangerous.  The idea is to have a hand-held device that can provide real-time decision-support, but is not connected to any data link - what used to be called "air gap" security.  [Note: It is estimated that more than 70% of all trades on equity markets now are algorithmically driven.  If built right, they provide a real edge. ]  For info on air-gap security, read Bruce Schneier's piece here:    The Dow 20,000 thing is a bit of a concern.  There may be too much digital cash floating around.  Historically, the markets have been very effective at removing excess wealth.  If interest rates move up quickly, equity markets could fall 20%.  That is DOW 16,000, and it may happen at "internet speed".  The current stability may be a dangerous illusion, as powerful forces pull our world and its markets in divergent directions simultaneously.   ]

[ Dec. 13, 2016 - Got "DOSpad Math" compiled and deployed successfully to iPad 2, using Xcode 6.2.3.  Insane work. Also, updated "Time Travel" page with Harlan Ellison montage. (Click "More" button on top line right to show "Time Travel Research" page) ]

[ Dec. 7, 2016 - OpenWatcom Fortran77 on the iPad  - details ]

[ Nov. 28,2016 - Included info on how to get Python 2.7 running on iPad ]

[ Nov. 03,2016 - Added page: How to put VLC 2.1.3 on iPad-1 running iOS 5.1.1 ]

[ Oct. 23,2016 - Added page on "GNU gcc" = How to compile & run a C program on iPad ]

The Hack Which Launched this Site...

I put this website together after I hacked my old iPad, and felt I should publish the method, as it turned the old device into a very cool experimental platform, and a surprisingly useful research tool, as it is possible to obtain most of the Unix/Linux utilities from Cydia, and configure Safari to be able to directly download viewed content (eg: videos, .PDF files of important documents, etc.)  As well, there are application hives, or "repos", which offer very useful utilities, such as "iFile", which allows navigation of the native file system.  (One uses Cydia to add "sources", such as "" and "" to gain access to these additional applications).   (Further, if you use static IPv4 numbers on your local WiFi-enabled LAN, you can seamlessly transfer files between the iPad and either Windows or Linux machines.)

I've provided detailed instructions for "jailbreaking" the original iPad.  Once the iPad was opened up using the "Redsn0w" software,  Cydia was used to obtain *root* access to it.  It is our belief that *root* access should be provided to all device owners, if they request it.  ("root" is the User-id that provides full, administrative control in any Unix/Linux system.  It is like the "Administrator" account in Windows.)  It is a lawful act to obtain this access - known as a "jailbreak" - for any device which you own.   And by doing this, you can open up the range of applications and technologies that the device can address, regardless of the restrictive trade practices that device makers employ to limit the capability.

Once the iPad was unlocked, and SSH and SCP were configured and made available, I was able to install sAPL and APLSE on it.  I also installed Borland C++ 3.0, and compiled the Blowfish encryption algorithm, to confirm that DOSpad (the PC-DOS emulator available for the iPad) behaved correctly.  The generated .EXE files for Blowfish on Android with gDOSbox, Windows XP/SP3 CLI (Command Line Interface), and those compiled on the iPad under DOSpad are all identical.

I've also built and deployed thru the Google "Play Store", some interesting apps on the Android platform.  These include gDOSbox, GNUplot37, and several APL interpreters.  The Android software is experimental, and does not contain any usage tracking or in-app advertising.  I did this project mainly because I wanted to run a real APL on a tablet, as APL was the first language I learned, at University of Toronto and University of Waterloo.

APL was (and is) unique in that it provided real-time, interactive computing, before the advent of personal computers and tablets.  Ken Iverson, the inventor of APL, originally developed the language as a notational tool to express algorithms.  IBM then took the idea, and built the interpreter.  Personal computers - which ran only APL! - were developed by IBM in the early-to-mid 1970's. (A prototype was made available to some clients in 1973.  It was a complete personal computer - called "Special Computer, APL Machine Portable" (SCAMP) - and it ran APL.)  For those of us involved in computing in those early years, APL was the only real-time, interactive computing environment, and it was the first desktop, personal-computer system, as well.

So I just had to put APL on these little tablets.

The website here is a work-in-progress.   It consists of:

  - APL on an iPad  - the notes on how to hack the iPad, and open it up to installation of non-Apple/iTunes software.   Also includes a link to my github site, where a zip file of the P/C version of sAPL files can be obtained.  sAPL is freeware, and can run in "Cmd" shell on any Windows machine, as well as Android gDOSbox, or iPad DOSpad.  (See below)

  -  GEMESYS Apps on Android - just a short summary.  This software is experimental, and is provided primarily for educational and recreational use.  Google keeps changing Android, and this makes the Android environment fragile and unstable.  Note that if you are running Android Lollipop or Marshmallow, you will need to download the "Hacker's Keyboard" and make it the default, to use the GEMESYS Android apps now, as Google has altered Android system keyboard operation.  (See below...)

  - Fractal Examples on iPad using APLSE  - I show two recent images generated using APLSE running on the iPad. (Also down below...)

  - GNU gcc & Python 2.7 - How to Compile & Run C programs natively, and install Python  - Application development for tablets typically involves IDE's and a bunch of stuff to fabricate vendor-locked packages.  With a *jailbroken* iPad, you can load GNU gcc onto it, and develop and run C programs right on the device. The underlying iOS is very unix/linux like, and can be used effectively on its own, as a fully functional computer, once tools are made available.  Python 2.7.3 can be installed also. (First button, top line)

  - OpenWatcom Fortran-77 - How to run Fortran on an iPad - This is another DOSpad trick, where OpenWatcom Fortran77 is shown configured and running on the iPad. 

  - How to Put VLC on iPad-1 - Apple will not let you access the older versions of applications from their iTunes/iStore.  They want you to buy a new device - each year, it seems.  But if you jailbreak your iPad, you can get the .IPA file from the VLC archive, and install it with Install0us.  VLC is fully open-source, and will let you watch downloaded .FLV (Flash Video) files.  VLC 2.1.3 for iPad-1, running iOS 5.1.1 is Taro-approved.

  - Pictures from Space - I have a research-interest in Chaos Theory, fractal geometry, turbulent flow, and so on, with specific reference to the Capital Markets.  Images from space show an astonishing variety of fractal examples.  The recent Juno probe has returned amazing images of the turbulent flow of the atmosphere of Jupiter. (Second button, top line). The ISS also shows wonderful space-views of our home-world.

   -  Economics and the Stock Market.  (What I studied (officially) when I was at school).  And since we pay the bills as much by our investment results, as by our consulting efforts, the markets remain a constant and critical focus.  I will try to note some useful observations here. (Third button, top-line)

  -  Statistics & The Null-Hypothesis.  A very great deal of what is written about statistical methods, and the mathematics of data-science oriented research, is either incoherent or incomprehensible.  I ran across this well-written note, and before it is vandalized by professional statisticians who seek to raise the barriers to entry to their dark-arts, I thought it should be preserved.  I will try to add some clear examples of actual research.  I used to use SPSS, SAS and R.  Awful stuff, but data analysis can yield wonderful dividends, if it is done right, and you understand *exactly* what you are doing.  (Button 4, top-line)

  -  Hausdorff (Fractal) Dimension Examples and Explanations - lifted from other websites (which may change).  The examples and explanations are good, and I wanted to preserve them. (More button / top line)

  -  Images and notes on Time Travel (Why not?  It's my site!)  And who does not love the idea of Time Travel?   We are all time travellers, aren't we?  The past offers us insight, and the future, opportunity.  But what will the future hold - pleasant dreams or our worst nightmares?    (More button / top line)

Any comments or questions can be addressed to gemesyscanada < a t > gmail dot com.  (I spell out the email address here to limit the spam robots from mail-bombing me.  I trust you can understand the syntax.)

  -  TensorFlow Work.  This is my latest thing, and I hope to use this new (old) technology to pull together a number of threads, and get to a better method.  If Thaler's work is right (based on Kahneman and Tversky), my weakness and deep loss-aversion will just keep me from taking action, when it is needed most.   It appears one must effectively automate all investment activity, if one is to have any chance nowadays.  The low-return world demands it, as do the AI/algorithmic-driven modern markets.  One cannot fight the world - one must dance with it.

APL on an iPad

I have hacked my iPad Gen-1, and have loaded sAPL on it.  This was the APL product I originally released on the Blackberry Playbook, and remains available for Android devices, from the Google PlayStore. (A Windows Cmd-shell and/or DOSbox version avail. from Gemesys Github account, as a .zip file.)   sAPL is a P/C version of the original IP Sharp APL mainframe product, which ran on IBM 370's, and Amdahl V8's.  This iPad version, running under DOSpad, provides a workspace just over 300K.  It is a small, but reliable, implementation of a full APL.

The hack involved jailbreaking the iPad with Redsn0w, which then installs the excellent Cydia environment, which allows installation of all the Linux/Unix utilities, including OpenSSH, related network utilities, and various Linux/Unix utilities such as PSTree, Screen, VIm, etc.   

RedSn0w details:

From the above site, navigate to page bottom, and download the file: "redsn0w 0.9.15b3" for Windows, and note that it needs to be run in Administrator mode.

The first step is to have iTunes installed on your Windows P/C.  You might have to locate drivers for the iPad.  If you have an iPad, you probably have iTunes installed on your machine already.  If not, you have to find it and install it for the jailbreak to work.  Note, this process is *completely at the user's own risk*.  That means you should back up anything on the iPad you don't want to lose.  The responsibility for following these steps lies with you as the experimenter.  (This is why I used my old, first-generation iPad.)

The steps for the jailbreak are detailed, and require the iPad to be connected to a Windows P/C.  You download the RedSn0w tool and install it on your Windows P/C.  Then, connect your iPad to the P/C, and put the iPad into DFU mode.  This is done by holding the home and power buttons on the iPad for 10 seconds, then releasing the power button while keeping the home button held for another 10 seconds.  Then release the home button, and the screen should stay black.  This is the mode where iTunes can update your device.

There are several tutorials on the internet about this process.  Here is a link to one of them: 

    Tutorial on iOS 5.1.1 iPad Jailbreak:

The next step is to obtain and configure the OpenSSH/OpenSSL environment on your Windows box, if you have not already done so.  OpenSSH is the open-source "Secure SHell"; it uses the OpenSSL crypto libraries, and allows you to log in to the iPad from your Windows machine, and transfer files to and from the iPad, using SCP, the "Secure CP" program (cp is the Unix name for COPY).  The jailbroken iPad behaves much like a Debian Linux box (Cydia uses the Debian dpkg/APT packaging tools), with an Apple-built touch-screen interface.  There are several ways to get SSH/SSL onto your Windows P/C.  I used the SSH install code from MLS-Software.  As of May 12, 2016, the most recent SSH-Install-for-Windows from MLS-Software installs OpenSSH 7.2p2, with OpenSSL 1.0.2g, which is current as of March 1st, 2016.  You probably want to keep the OpenSSH version fairly current, as it and OpenSSL are regularly updated to address discovered vulnerabilities.  OpenSSH is typically used to log in between Windows and Linux boxes.  If you install SSHD (the Secure SHell Daemon) for Windows on your Windows box, you can log in to your Windows machine from both the iPad and other Linux boxes, if your LAN and machine firewalls are set to allow that.  Learn more about SSH:

The SetupSSH Installer for Windows SSH/SSL:

Once you have SSH access to the iPad, the APL executables can be installed as directories in the /var/mobile/Documents folder, where they will show up as available in the C: drive, when DOSpad is started.  You do not need to modify the dospad.cfg file to mount C:, as DOSpad already mounts /var/mobile/Documents as C:.

Confirm you have SSH installed on your Windows box by checking the version number, which is shown if you use the "-V" option.  Eg: from your Windows box, in a Cmd window, where you see "C:\>", enter "ssh -V", and confirm you get something like "OpenSSH_7.2p2, OpenSSL 1.0.2g 1 Mar 2016" in response.  Then try a login to root, from Windows.  Note: you have to make sure an IP# has been assigned to the iPad (or that your router has given it one with DHCP).  Check the settings on the iPad to see what its IP number is, then use that to log into it.

Login to iPad, with IP#, from Cmd Shell in Windows...

     C:\>  ssh  root@<iPad-IP#>     (where <iPad-IP#> is your iPad's IP number)

The root password on a jailbroken iPad is "alpine".   Once you are logged in, try an "ls -l" command to list your files.  I run static IP#s in my lab - each machine is assigned a non-changing IPV4 number.  Typically, your router will assign your iPad a temporary IP#.  On your iPad, click Settings, then WiFi, then click the arrow to the far right of your local network name, and you should get a screen that displays your local IP number.  Use that number above to access your iPad. 

Files can be migrated to and from the iPad using "scp", the secure copy program.  The Redsn0w jailbreak results in two available login id's, "root" and "mobile".  Both have original Apple passwords of "alpine".  Download the terminal emulator from Cydia, start it, run "su" and login as root, and use "passwd" to change the default passwords for root and mobile userids as your first task.

Once the jailbreak is complete, and OpenSSH is installed, the DOSpad application has to be installed.  You have to copy the .DEB file for DOSpad onto the iPad, and run dpkg to install the Debian package file on the iPad.  The .DEB file for DOSpad can be obtained from the site.  URL is: 

DOSpad for iPad: (Cydia version):

The DOSpad source code is on Github, at:

You do not need the source code to do this, but it is available if one is curious.

To install a .DEB file, there are again a few different methods.  I used "scp" to copy the "dospad.deb" file to the iPad directory /var/root/Media/AutoInstall, which is Method#2 in the tutorial link.  You might have to create the AutoInstall directory, under ../Media.  I prefer the manual approach, which is done using the "Terminal>" icon on the iPad, using "su" to become "root", and then manually running "dpkg".  If you copied the dospad.deb file into directory /var/root/Media, then you should be able to run these steps in a "Terminal" window on the iPad:

    su [ENTER]
    alpine [ENTER]   (or whatever the root password now is, since you should have changed it!)
    dpkg -i /var/root/Media/dospad.deb [ENTER]

Here is a tutorial about installation of ".DEB" files on a jailbroken iPad.

The final step is to use "scp" to copy over your sAPL directory, with sub-directories, from your Windows box.  Before you do that, it is a good idea to create the sAPL directory on the iPad.  Do that using the Terminal application, which you can get from the Cydia application.  Cydia is used to download open-source software for the jailbroken iPad.   On the iPad, click on the Cydia icon, and search for and download the "Terminal" utility.  It lets you start a terminal session on the iPad, and create a directory for sAPL in /var/mobile/Documents.  Just: 1) Start Terminal  2) cd /var/mobile/Documents  3) mkdir SAPL.  Check that the empty directory is there, and in the right location, with "ls -l".  From any directory, you can enter "pwd", the Unix utility to "print working directory", which shows you what directory you are in.    If you "ls -l /var/mobile/Documents", you should see a blue-coloured filename called SAPL.  (We will be using DOSpad, which is case-insensitive.) 
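The three Terminal steps above can be sketched as a small shell sequence.  (A sketch only: here $DOCS stands in for /var/mobile/Documents, so you can rehearse it on any Unix-style box before doing it on the iPad.)

```shell
# Create the SAPL target directory and verify it.
# $DOCS stands in for /var/mobile/Documents (use the real path on the iPad).
DOCS="${DOCS:-/tmp/Documents-demo}"
mkdir -p "$DOCS"
cd "$DOCS"
mkdir -p SAPL     # step 3 from the text; -p avoids an error if it already exists
pwd               # "print working directory" - confirms where we are
ls -l             # SAPL should appear in the listing
```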

Now, you can run an "scp" from your Windows box, to the iPad.  If you have an sAPL directory on your Windows P/C, you can migrate the entire directory, with subdirs, from the Windows box to the iPad, with a single "scp" command. 

Eg:  c:\> scp -r \aplstuff\sAPL  mobile@<iPad-IP#>:/var/mobile/Documents/

The above command copies the sAPL directory from Windows directory C:\aplstuff\sAPL to the /var/mobile/Documents/SAPL directory on the iPad.
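Since "scp -r" is just a recursive copy over SSH, the transfer can be rehearsed locally with "cp -r" - same recursive semantics, no network needed.  (A sketch with made-up stand-in directories; on the real network you would address the iPad as mobile@<iPad-IP#>.)

```shell
# Local rehearsal of the Windows-to-iPad transfer; cp -r mirrors scp -r's
# recursive copy of a whole directory tree.
SRC=/tmp/aplstuff-demo/sAPL        # stand-in for C:\aplstuff\sAPL
DST=/tmp/mobile-Documents-demo     # stand-in for /var/mobile/Documents
mkdir -p "$SRC/apldata" "$DST"
echo "demo workspace" > "$SRC/apldata/demo.ws"
cp -r "$SRC" "$DST/"               # whole tree, subdirectories included
ls -R "$DST/sAPL"                  # confirm the tree arrived intact
```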

To run sAPL on the iPad, click the DOSpad icon, and when the "C:\>" appears, simply enter "CD SAPL", which takes you to the SAPL directory.  From there, just enter APL (which starts a small batch file, to set up the font, etc.).  You can exit sAPL and return to the DOSpad "C:\>" prompt with ")off".  The APL characters in sAPL can be accessed using an ALT-key sequence from the displayed DOSpad keyboard.  If you use sAPL on the iPad in landscape mode, you have full access to the DOS-based screen.  (Try turning it to portrait-mode, and see what you get then... Some artist worked pretty hard to create an image - complete with a yellow sticky Post-it note - of an old 80386-based P/C!)

"Where can I get sAPL?"  Of course, a critical question.  I have a "" file which is the full sAPL package, including fonts.   The total available workspace size is 312,722 bytes, a very tiny workspace, (you get the size using []WA (quad-WA) function, in a clear workspace), but sAPL is fully free and when I purchased my copy, direct from IP Sharp's head office in Toronto many years ago, it came with a licence that not only allowed me to make copies and share it, but in fact explicitly encouraged this.

I have put a full copy of sAPL up on Github, available as the "" file, which can be downloaded and installed to any subdirectory on any DOS-enabled device, including most versions and flavours of Windows.  Just go to the Gemesys_repository on Github, and click the "Download ZIP" button to get it.  The whole thing is tiny, and contains a collection of sample experimental workspaces in the "apldata" subdirectory.  If you have Windows-XP/SP3 or above, you can just copy the "" file into an empty subdirectory, and use File Explorer to extract all the files into that directory.  Once the zip extract is done, go to a command shell (find the Cmd shell at C:\windows\system32\cmd.exe on old Windows-XP/SP3 machines, or use Powershell on newer Windows versions), and run APL from the C:\> prompt.    Here is a direct link to the Github Gemesys_repository:

Get sAPL here:

Hope this is useful for some of the APL experimenters out there...

Happy array bashing!

GEMESYS Apps for Android - on the Google Play Store:

gDOSbox has over 50,000 downloads on Google Play Store

The following GEMESYS Android Apps are available on the Google Play Store:

gDOSbox  -  This is a full-featured implementation of the DOSbox-0.74 open-source DOS emulator for Android.  It was developed for Android version 4 (KitKat series), and was recently upgraded to work on Android 5 series (and above) devices.  Recent changes by Google to their keyboard have caused issues on some devices, so we strongly recommend the "Hacker's Keyboard", by Klaus Weidner. 

Download "Hacker's Keyboard" from the Google Play Store, then use the Settings icon, scroll to "Language and Input", and select/invoke the "Hacker's Keyboard".  Then, in the "Default Keyboard" option, choose the "Hacker's Keyboard" as your Default Keyboard.  The Google keyboard attempts to hijack *all* user input, and damages the gDOSbox interface routines.

gDOSbox is a full DOS implementation, with corrected math routines, which allows DOS .exe files to be run on an Android tablet. 

GNUplot37 - A version of the GNUplot graph generation tool.  Allows data to be quickly plotted in two and three dimensions, as well as supporting math processing and curve-fitting to data, and displaying the result.  Try it with "plot sin(x)" to see a sine wave.  Then load the demo (hundreds of examples) with "load 'all.dem' ".   To clear the screen (if using an on-screen keyboard), use "!cls", and use "!dir /p" to review all the GNUplot examples available.

sAPL      -    The original IP Sharp 32-bit APL, which runs in an emulated IBM 360/75 environment as a series of .exe files, originally released to run on IBM P/C's, and then made into a freeware product by IP Sharp, to encourage APL usage and education.  APL characters are generated by ALT-key sequences (eg. ALT-L creates the APL quad character, ALT-[ creates the assignment operator, etc.), so the Hacker's Keyboard is required.

APLSE    -   The STSC APL freeware product, directly downloadable from the PlayStore.  (You do not need to install gDOSbox separately; it is loaded first.)  This is an excellent small-footprint APL, which has full graphics support.  It is reliable, and was released as a freeware product to encourage and assist APL education.  Like sAPL, the APL characters are created using ALT sequences, so ALT-[, for example, is the assignment operator.  The "Hacker's Keyboard" is required.

TryAPL2  -   IBM's full-featured "TryAPL2" product, which allows a subset of early APL2 to be run on a P/C.  This is a working APL, which includes IBM's variant of the enclosed-array extensions.  APL characters are generated with Shift-letter sequences, so gKeyboard can be used with this APL.

WatAPL  -    The original Watcom APL, circa early 1980's.   This was recovered from an original Watcom APL System floppy diskette, and dates from 1984.  It can be used with gKeyboard, as the APL characters are generated with Shift-key sequences.

gKeyboard - A basic keyboard, with the APL characters shown on keytops.  Useful for TryAPL2 and WatAPL, and for learning the location of APL characters on the keyboard.
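To have some data on hand for GNUplot37 experiments, a few lines of shell/awk will tabulate a sine wave into a file (the filename sine.dat is just an example).  Inside GNUplot you would then enter:  plot "sine.dat" with lines

```shell
# Tabulate 100 points of sin(x) over 0..2*pi into sine.dat.
awk 'BEGIN {
  pi = 3.14159265358979;
  for (i = 0; i < 100; i++) {
    x = 2 * pi * i / 99;
    printf "%.6f %.6f\n", x, sin(x);
  }
}' > sine.dat
wc -l sine.dat    # 100 lines of x/y pairs
```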

All GEMESYS software is freeware for educational purposes, and contains *no* advertising or in-app usage monitoring or tracking.

The seven GEMESYS apps for Android. No *root* access is required to run any of them!

Some Fractal Examples on iPad - Logistic Equation Tent-Map

I've just put APLSE on the iPad, and thought I would calculate and generate a graphic of the Logistic Equation phase-space, as a fractal example.  For those who study or work with fractals and Chaos Theory, the "Tent Map" is well known.  That was my first example.

But in almost all cases, examples of the "Tent Map" are displayed with the initial results dropped from each vector of oscillations.  And the negative side of the chaotic region is rarely shown.  Plus, it is possible to extend the display beyond the regions where the calculated vectors typically "blow up", just by branching out before your code generates a "limit error".  This provides an interesting image of what is really happening as the generative process unfolds via the feedback (or should we call it "feed-forward"?) algorithm.  Both images were calculated and generated on the iPad.
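The feedback process behind these images is just the logistic iteration x <- r*x*(1-x).  A few lines of shell/awk (a sketch, not the APL workspace code) show the behaviour described above: at r=2.0 the orbit collapses to a fixed point, while at r=3.2 it settles into a two-value oscillation - and the early iterates, usually dropped from published "Tent Map" pictures, are exactly the transient.

```shell
# Iterate the logistic map x <- r*x*(1-x) and print the last four values.
logistic() {  # $1 = the parameter r
  awk -v r="$1" 'BEGIN {
    x = 0.4;                           # arbitrary starting point in (0,1)
    for (n = 1; n <= 200; n++) {       # 200 iterations; early ones are transient
      x = r * x * (1 - x);
      if (n > 196) printf "%.4f\n", x;
    }
  }'
}
echo "r=2.0:"; logistic 2.0   # converges to the fixed point 0.5000
echo "r=3.2:"; logistic 3.2   # alternates between 0.5130 and 0.7995
```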


Logistic Equation "Tent Map", generated on iPad using APLSE

Chart of example raw price-data series, with 20-day simple moving average, showing breakout and run up to a "resistance" level.

The Benefits of Using Datascience. The model overstates the case, but it looks like it noticed something, as the price has moved higher each day. And a dividend-discount model suggests a target of $130 is not unreasonable, given the current yield curve. How did the model know this?
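For readers who have not met a dividend-discount model: the classic Gordon-growth form prices a stock as P = D1 / (r - g), next year's dividend over (required return minus dividend growth).  The inputs below are made-up illustrative numbers, not the model's actual data - they just show how a $130-style target falls out of the arithmetic.

```shell
# Gordon-growth dividend-discount model:  P = D1 / (r - g)
# Inputs here are illustrative only, not the actual model's data.
ddm() {  # $1 = next-year dividend D1, $2 = required return r, $3 = growth g
  awk -v d="$1" -v r="$2" -v g="$3" 'BEGIN { printf "%.2f\n", d / (r - g) }'
}
ddm 3.90 0.06 0.03    # a $3.90 dividend, 6% required return, 3% growth -> 130.00
```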

First Tensorflow Program (from "Getting Started" Tutorial)

TensorBoard Graph Image of Simple Linear Regression Model. This image is generated from event data Tensorflow writes to its log file. TensorBoard reads the log file, and generates visual images to illustrate your model structure and how the training operation ran. You access the TensorBoard displays at localhost:6006, using your web-browser.

Example of Xerion on Fedora Linux, running Xor2 test-case network.

Xerion/UTS running Xor2 with Unit & Link Display. Background shows Xerhack, a visualization tool built using Tcl/Tk-Canvas.

Left side is Xerion on Linux, right side is Actnet function in sAPL on iPad, with same network weights. Example training cases produce same network output, both platforms.

Neural Network output chart (Actual & Predicted) for old D-Mark data (late 1990's), after 90,000 iterations, using steepest descent, fixed-step search. Basically, just curve-fitting the time-series. Not really very useful, but it shows the success of the Xerion backpropagation algorithm running on the Linux platform. Chart produced in GNUplot, viewed via Ghostscript.

This is just a chart of today's action in the Cdn-Dollar. It's like the output of a square-wave generator - zero rise-time, a bit of noise on the top flatline, and then back to where it started, 100% mean-reversion. Floor-traders on the old (and very sophisticated) commodity exchanges many years ago would call this "Shake Da Money Tree!", or sometimes "a gun-run" (from "gunning for stops"), where the market would zoom in one direction, take out all the stop-loss orders, and then zoom right back to where it started from. This is not a random process or phenomenon.

Prototype of an Augmenter-AI. Do the kind of data-science you need to do to actually make money, and run it on a small Android tablet, that you can carry in your pocket. Your counterparties are *all* using AI technology, so each fellow in the field (or the barn) had better have access to it as well. Oh, and yes, the M-wave analyzer says the probability of a hard upturn is quite high, now for this series. I use this data series because it is nice and choppy, and screens well.

This picture is of the Probability Calculator - here shown running on a Samsung Tab-A tablet. It tells you, for a given level of capital and a chosen level of acceptable risk, how big a bet should be, and what the expected outcome is. Everyone has this type of tool now, and there are few asymmetric opportunities for arbitrage. But they do sometimes still come up.
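The Probability Calculator's own formula is not published here, but the classic Kelly fraction is one standard answer to "how big should the bet be, given my edge": f* = p - (1-p)/b, where p is the win probability and b the net odds.  A sketch with made-up numbers:

```shell
# Kelly bet-sizing fraction:  f* = p - (1 - p) / b
# (one standard formula; NOT necessarily what the Probability Calculator uses)
kelly() {  # $1 = win probability p, $2 = net odds b (payoff per unit staked)
  awk -v p="$1" -v b="$2" 'BEGIN { printf "%.3f\n", p - (1 - p) / b }'
}
kelly 0.60 1.0    # even-money bet, 60% win rate -> stake 0.200 of capital
kelly 0.55 2.0    # 2:1 payoff, 55% win rate  -> stake 0.325 of capital
```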

And here is the Probability Calculator running on the (*hacked*) iPad. This shows an estimated probability density function for a possible trade with a 20-day duration. The underlying market database can be migrated to the iPad from the desktop box via secure copy (the Linux utility "scp", given that one has Cygwin tools to support "ssh" (secure shell) on the Windows box that maintains the data). The idea, of course, is to have a series of neural networks watching all the data in something close to real time, and migrating information like this to the tablet, where visualization can be used to sanity-check the network's recommendations, before pulling the trigger on any given trade.