I want to be able to use Self on my tablet and phone, but that is difficult: Self assumes at least a two-button mouse, one for the main (orange-ish) menu and one for the blueish morph menu, and, at least on macOS, it relies on the trackpad for scrolling easily around the Kansas infinite desktop.
Long term it would be great to map gestures to Self events, but to do that we need to replace the current VNC/X11 setup.
I tried an experiment where the desktop itself could be dragged around, but the latency was too high: the delayed response to dragging was worse than no feedback at all.
So instead I’ve created a morph which pops up when the desktop is clicked.
It is shaped like a circle with two arcs coloured like the menus. If you drag from the circle and release, you scroll the desktop (or, more accurately, move your view within the desktop).
If you drag but then release inside the circle, nothing happens.
If you drag onto one of the coloured patches and release, the next click will be counted as that mouse button.
So to get a background menu, drag and release over the orange patch. Then click on the desktop and the background menu will come up.
To get a morph menu, drag and release over the blue patch. Then click on the morph you are interested in.
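The drag-and-release rules above amount to a small decision table. Here is a minimal sketch in Python of that logic; all the names are my own invention for illustration, not the actual Self/Morphic implementation:

```python
# Hypothetical sketch of the scroll-morph's drag-release behaviour.
# The target names are illustrative labels, not real Self identifiers.

def handle_drag_release(release_target):
    """Decide what a drag starting on the circle does, based on where it ends."""
    if release_target == "inside-circle":
        return "no-op"                        # released back inside: nothing happens
    if release_target == "orange-patch":
        return "next-click-is-menu-button"    # next click opens the background menu
    if release_target == "blue-patch":
        return "next-click-is-morph-button"   # next click opens the clicked morph's menu
    return "scroll-view"                      # released on the desktop: move the view
```

So, for example, `handle_drag_release("orange-patch")` arms the next click to act as the background-menu button, matching the steps described above.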
This isn’t perfect, but at least for me it makes working on an iPad possible, and the scrolling part is also useful on my laptop for moving around Kansas.
There is of course a preference to turn this off: preferences desktop useViewScrollMorph: false
I’ve put this up on OurSelf.io (http://ourself.io/), and it would be fab if people could try it out and let me know any thoughts, suggestions, etc. Just start a trial, play around, and let me know.