Focus Stacking with a Raspberry Pi – The Quest for Repeatability and Predictability

    Repeatability – (of an experiment, etc.) producing or capable of producing the same result again

    Predictability – consistent repetition of a state, course of action, behavior, or the like, making it possible to know in advance what to expect

    ………………………………..wouldn’t that be nice?

    Last Tuesday at Tetbury Camera Club we had an inspiring talk by Jay Myrdal, who explained how he made stunning commercial images back in the day before Photoshop, when it all had to be done in camera – quite often a plate camera. The techniques he explained were fascinating and the results were truly impressive. Some of his images involved action shots, such as an exploding light bulb – this would be just one part of the image, but he would have to shoot it over and over again to get all the timings of the effects just right. He built a rig to automate and adjust the various factors involved so he could tweak the important parameters, knowing exactly what he had done beforehand, and so converge on an optimal set-up. He also scrupulously calculated all the angles involved in multi-layered, super-imposed images to make things look right from a perspective point of view.

    Double fly

    This got me thinking that I needed to apply his philosophy to my macro photography, which up to now I have been carrying out on a rather hit and miss basis in terms of choice of lens configuration, and adjustment of stacking parameters. So I decided to apply some of Jay’s thinking to the matter.

    My rig is pretty much automated already, but the key issue here is recording all the parameters that I have used for each stack in enough detail to enable evaluative decisions to be made after processing (which may be some time later) so that the parameters for the next stack can be improved or consolidated. I already have lots of bits of paper lying around with scribblings of what I have done but no real way of tying these back to the images I have processed. And not everything is written down anyway. It needed organisation and preferably automation.

    I was already in the process of updating my Raspberry Pi Python software for my focus stacker to add refinements based on the last few months' shooting. As a result, my mind was in tune (as much as it ever will be!) with Python 2.7 and Tkinter, and then two and two came together and for once made four! I decided to automate the logging of my focus stacks using my RPi.

    I won’t bore you with the details of how I did it – programming and more programming is all you need to know! But this is what it now does.

    There are some user interface windows to enter information that the RPi doesn't know, such as what camera is being used, the lens configuration, the subject matter, and the camera settings. You can also enter a note for each stack. The software gathers up all the info it already knows about the shoot, such as the near and far focus points, the focus increment and so on; date- and time-stamps the log; gives it the next shoot number; and then (and only then), when the shoot sequence button is pressed, the whole lot is appended to a .csv log file.
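As a rough sketch of the append-to-log step (the field names and values here are made up for illustration – the real log has more columns – and the rig itself runs Python 2.7 while this sketch uses Python 3's csv handling):

```python
import csv
import datetime
import os

LOG_FILE = "stack_log.csv"  # hypothetical file name

# Hypothetical field names; the real log records more parameters.
FIELDS = ["shoot_no", "timestamp", "camera", "lens_config",
          "subject", "near_mm", "far_mm", "increment_mm", "note"]

def append_log(params):
    """Append one shoot's parameters as a row, writing a header first if the file is new."""
    new_file = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(params)

# Example row, triggered in the real rig only by the shoot sequence button.
append_log({"shoot_no": 1,
            "timestamp": datetime.datetime(2014, 10, 18, 17, 21).isoformat(),
            "camera": "DSLR", "lens_config": "reversed 50mm",
            "subject": "fly", "near_mm": 0.0, "far_mm": 4.5,
            "increment_mm": 0.05, "note": "test stack"})
```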

    The .csv file can be read by Excel, and as long as the camera and the RPi are telling the same time I can figure out which log goes with which image stack.
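The matching itself just pairs each stack's capture time with the nearest log timestamp. A minimal sketch, with hypothetical timestamps standing in for the log entries and the EXIF time of the first frame of each stack:

```python
import datetime

# Hypothetical data: timestamps from the .csv log and EXIF times of each stack's first frame.
log_times = [datetime.datetime(2014, 10, 18, 17, 21, 0),
             datetime.datetime(2014, 10, 18, 17, 40, 30)]
image_times = [datetime.datetime(2014, 10, 18, 17, 21, 4),
               datetime.datetime(2014, 10, 18, 17, 40, 33)]

def nearest_log(image_time, logs):
    """Return the index of the log entry whose timestamp is closest to the image time."""
    return min(range(len(logs)), key=lambda i: abs(logs[i] - image_time))

matches = [nearest_log(t, log_times) for t in image_times]
print(matches)  # -> [0, 1]
```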

    One hiccup in proceedings is that the RPi is not on the internet, and so it cannot find out automatically what time it is. That's because it doesn't have its own real-time clock and battery (apparently to keep the cost down) and relies on the internet to reset its clock each time it is turned on. No internet, and the clock just restarts from where it left off last time it was on! Not terribly useful. At the moment I have to enter the time manually into the RPi when I turn it on – it is very easy [sudo date -s "Sat, October 18, 2014 17:21:00"] for example. But I might forget.
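One small helper that takes the error out of typing the command: generate the whole `sudo date -s "..."` string from the laptop's own clock and paste it into the Pi's terminal. A sketch (the `pi_date_command` helper is my own invention, not part of the rig):

```python
import datetime

def pi_date_command(now=None):
    """Build the sudo date -s "..." string to paste into the Pi's terminal."""
    now = now or datetime.datetime.now()
    return now.strftime('sudo date -s "%a, %B %d, %Y %H:%M:%S"')

cmd = pi_date_command(datetime.datetime(2014, 10, 18, 17, 21, 0))
print(cmd)  # -> sudo date -s "Sat, October 18, 2014 17:21:00"
```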

    What would be better is to get the laptop to tell the RPi what the time is. That is all within the realms of possibility. I would have to set up the laptop as an NTP server, which Windows 7 does support. It seems to require some mining in the registry, something I am not too keen on doing. There is plenty of advice on the internet, but I will have to nerve myself up before trying this. Then you have to rummage in the RPi's software settings too. I may have to get my (no-longer) resident expert to help me on this some time.

    Finally, I have also made some measurements to establish and check the exact magnifications of my various lens configurations, and have produced a chart to enable me to select the best lens configuration for a given subject size. It requires measurement of the subject height to get it right first time – something I had not been bothering to do. The chart is shown below.

    lens data
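The logic behind the chart can be sketched like this. The sensor size, the configuration names and their magnifications below are all made-up placeholders (the real values were measured); the idea is just that the magnification needed for a subject to fill the frame is sensor height divided by subject height, and you pick the smallest magnification that still covers it:

```python
# Hypothetical sensor and lens data; the real chart uses measured magnifications.
SENSOR_HEIGHT_MM = 15.6  # assumed APS-C sensor height

LENS_CONFIGS = {
    "100mm macro": 1.0,
    "100mm + 36mm tube": 1.4,
    "reversed 50mm on 100mm": 2.0,
    "reversed 28mm on 100mm": 3.6,
}

def required_magnification(subject_height_mm):
    """Magnification needed for the subject to just fill the frame height."""
    return SENSOR_HEIGHT_MM / subject_height_mm

def best_config(subject_height_mm):
    """Pick the smallest magnification that still lets the subject fill the frame."""
    need = required_magnification(subject_height_mm)
    candidates = {name: m for name, m in LENS_CONFIGS.items() if m >= need}
    if candidates:
        return min(candidates, key=candidates.get)
    return max(LENS_CONFIGS, key=LENS_CONFIGS.get)  # subject too big: use the widest option

print(best_config(10.0))  # a 10 mm subject needs about 1.56x
```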

     

    Focus Stacking with a Raspberry Pi – Connecting the Pi to Hardware – Testing

    There has been a lot of progress, and the build and testing are advancing.

    20140111_home_011
    Here you can see the wiring neatly finished with cable ties (I hate untidy wiring!). The Sainsmart relay module is in the foreground left.

    The new 2-channel Sainsmart relay module arrived; it worked perfectly without any difficulties, so I must assume that the cheapo version was defective out of the box. With all the hardware now available, the only thing left to do was to connect up the Pi. To reduce the chances of getting something seriously wrong, I prepared a table of connections to the Pi GPIO header so I knew exactly which wire went where, with insulation colours, pin numbers and destinations all clearly defined. I have decided to adopt the header pin numbers rather than the BCM2835 nomenclature in the software GPIO code, as this seems easier to understand, less likely to cause confusion when connecting up the header ribbon cable, and easier to double-check between code and hardware wiring. Then it was a matter of connecting things up carefully and testing them one at a time, in the order: motor, relay module, and individual inputs and outputs.

    Table of wiring
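The same table can live in the code itself, keyed by header pin number so it double-checks directly against the wiring. The pins, colours and destinations below are made up for illustration, not the rig's actual connections:

```python
# Hypothetical wiring table keyed by header pin number (BOARD numbering),
# mirroring the paper table: pin -> (wire colour, destination).
WIRING = {
    7:  ("orange", "stepper driver STEP"),
    11: ("yellow", "stepper driver DIR"),
    13: ("green",  "relay module IN1"),
    15: ("blue",   "relay module IN2"),
    16: ("white",  "shutter opto-isolator"),
    18: ("grey",   "limit switch input"),
}

def describe(pin):
    """One human-readable line per connection, for checking against the hardware."""
    colour, dest = WIRING[pin]
    return "pin %d: %s wire -> %s" % (pin, colour, dest)

for pin in sorted(WIRING):
    print(describe(pin))

# On the Pi itself the same numbers are then used directly with RPi.GPIO:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BOARD)   # header pin numbers, not BCM channel numbers
#   GPIO.setup(13, GPIO.OUT)
```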

    Except that all did not go exactly according to plan!

    Having connected the motor control board to the Pi, it was necessary to move to a new version of my programme, which replaced the code for the “virtual” motor with the correct GPIO calls for my stepper motor controller. When I tried to run the new version, the Pi helpfully reported that the programme needed to run as root because the GPIO has to be controlled from there. That was quite easy to do by running the programme with the prefix “sudo”. However, when I did that, my remote terminal running MobaXterm threw up the message:

    X11 connection rejected because of wrong authentication…..

    So now the laptop could not display the GUI, even though that had worked fine when running from the Pi home directory. It seemed like an impossible conflict of demands. Googling the problem returned only a handful of results, which just added to my confusion: there was no solution that I could understand well enough to implement, and I don’t like implementing things that I don’t understand! Luckily, Jim came to the rescue and provided a very simple, easy-to-implement solution that did the trick immediately. All I had to do was add a line at the end of the file .profile that lives on the Pi in /home/pi:    export XAUTHORITY=/home/pi/.Xauthority

    20140111_home_010
    The laptop connected to the Pi via a network cable and interfacing through MobaXterm.

    A quick reboot and then we were off! Over the course of several days the various peripherals were connected and tested, and the Python 2.7  code modified here and there to introduce the GPIOs one by one and to fix the bugs that came with them that had not surfaced earlier.

     

    Everything is now working more or less as intended. However, the stepper motor does seem to run quite slowly, even with the minimum time interval between pulses. I can implement either of two gearing options with different pulley sizes, and I have swapped over to the step-up ratio rather than the 1:1 ratio it started with. The difference is that it now takes 450 motor steps per mm compared to the previous 1550.
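The conversion from a focus increment to motor steps is then straightforward, and the change of ratio means a traverse takes roughly 1550/450, or about 3.4 times, fewer pulses. A minimal sketch using the two figures above:

```python
STEPS_PER_MM_OLD = 1550  # original 1:1 pulley ratio
STEPS_PER_MM_NEW = 450   # after swapping to the step-up ratio

def steps_for_increment(increment_mm, steps_per_mm=STEPS_PER_MM_NEW):
    """Whole motor steps for a given focus increment (rounded to the nearest step)."""
    return int(round(increment_mm * steps_per_mm))

print(steps_for_increment(0.1))                     # -> 45
print(steps_for_increment(0.1, STEPS_PER_MM_OLD))   # -> 155
```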

     

    20140111_home_012
    Everything is now functional on the control panel apart from the top LED marked PI, which requires a separate piece of software.

    I have noticed what appears to be the effects of stiction in the gearing and the rack, which is causing a very slight jerky movement of the stage as the rubber band drive has enough give in it to allow this to happen. Jerky movement is not a problem in itself but unequal movement between successive shots would be a big problem. There is very little backlash in the gearing that came with the DVD rack and so I am fairly sure this is the result of the rubber band stretching; so I am probably going to have to replace the rubber band drive with a geared drive. Gears are on order.

     

     

    Still lots to do:

    • Finish implementing the software – mostly minor cosmetic issues apart from error/exception handling which needs a lot more work.
    • Tidy up the programme software to get rid of redundant code.
    • Implement a routine to turn on an LED when the Pi is booted up; so that I know when it is ready to connect to the laptop.
    • Build the stage platform and individual detachable mounts.
    • Devise an alignment tool for setting the height of the specimen on its mount, so that it can be prepared elsewhere and will then align perfectly with the central axis of the lens when installed on the stage.
    • Figure out how to put in place and light suitable backdrop(s); ideally different backdrops for different effects for different types of specimen.
    • Draw some neat circuit diagrams so that in six months’ time I know what I did!
    • Work out a diffused lighting arrangement and make necessary diffusers; possibly ping-pong balls?
    • Fix the broken LED light – too much heat applied when soldering – oops.
    • Add a warning beeper.
    • Tidy up the manual remote camera shutter trigger, which is hard wired. It enables me to trigger the shutter when setting up to take test shots without using the programme features and without accidentally jiggling the rig. I am looking for a nice (inexpensive) push button on a wire – the sort of thing that used to be used for advancing slides on a slide projector. I used to have one, in fact, but couldn’t find it now that it might have had a new lease of life.
    • Replace rubber band drive with gear wheels? Then recalibrate the rig.
    • Build a dust cover for the working parts.
    • Find some specimens.
    • Test, test, test!
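The boot-up LED item in the list above is simple enough to sketch now. Everything here is an assumption – the pin number, the return strings and the idea of launching it from /etc/rc.local are placeholders, and the guarded import is only there so the sketch runs on a machine without RPi.GPIO:

```python
# Sketch of the planned "Pi is booted" LED. Pin number and launch method
# (e.g. a line in /etc/rc.local) are assumptions, not the rig's actual set-up.
try:
    import RPi.GPIO as GPIO   # only present on the Pi itself
    ON_PI = True
except ImportError:
    ON_PI = False             # lets the sketch run (harmlessly) elsewhere

PI_LED_PIN = 12  # hypothetical header pin for the LED marked PI

def light_boot_led():
    """Turn on the LED so the control panel shows the Pi is ready to connect."""
    if ON_PI:
        GPIO.setmode(GPIO.BOARD)              # header pin numbering, as elsewhere in the rig
        GPIO.setup(PI_LED_PIN, GPIO.OUT)
        GPIO.output(PI_LED_PIN, GPIO.HIGH)
        return "LED on (pin %d)" % PI_LED_PIN
    return "no GPIO available - would light pin %d" % PI_LED_PIN

print(light_boot_led())
```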
    shoot seq
    Screen shot of the GUI for controlling the actual shoot sequence. The stage advances from the parked position to the first shot position and pauses so that I can check everything is good to go. If all is OK, a click on the “TAKE SEQUENCE” button shoots the rest of the sequence. Counters show progress. The GUI uses Tkinter for its implementation. Code is written in Python 2.7.
    dry-run screen
    Screen shot of the GUI for checking the set-up after all the parameters have been chosen and before shooting the sequence.

