Sunday, April 30, 2017

Sometimes things just have to go

 

IMAG0337

There are times when you cannot avoid obstructions getting in the way of a clean composition. Usually it is best to walk around a little, and it is surprising how easy it can be to avoid signs, lampposts and the like. At other times it does seem impossible to keep distracting objects out of a good composition: you might avoid that sign, but then a tree or parked car pops into frame. In such cases it is handy to have a context-sensitive eraser (or, alternatively, a cloning tool). There is a really neat context-aware tool in OnOne 10 Enhance called Perfect Eraser. It works best if you just remove a few sections at a time, but it does an amazing job with relatively few strokes. Photoshop has a similar tool, and Lightroom has a linear clone tool, similar to the spot removal tool, which may need a few more corrections. The results can make a significant difference to your composition.

IMAG0337-Edit
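If you are curious what a context-sensitive fill is doing behind the scenes, the sketch below shows the same general idea using OpenCV's inpainting. It is not the OnOne or Photoshop implementation, just an open-source analogue, and the file names (including the mask you would paint over the unwanted object) are placeholders.

```python
# An open-source analogue of a context-sensitive eraser, not the OnOne or
# Photoshop tool. The mask is a separate image where the unwanted object has
# been painted white; file names here are placeholders.
import cv2

photo = cv2.imread("IMAG0337.jpg")                          # original photo
mask = cv2.imread("object_mask.png", cv2.IMREAD_GRAYSCALE)  # white = remove

# Fill the masked region from its surroundings using Telea's inpainting method.
cleaned = cv2.inpaint(photo, mask, 5, cv2.INPAINT_TELEA)
cv2.imwrite("IMAG0337-Edit.jpg", cleaned)
```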

Friday, April 28, 2017

Stepping into HDR

EV Bracketed Setup on my camera

Whilst I am definitely not in favour of setting fixed methods (workflows) for others to follow, I figure it would be worthwhile to outline the steps I take to produce HDR images. I'll be doing it in four different ways, starting with my current most favoured approach, using Lightroom and Nik HDR Efex Pro 2.

I start most of my HDR work the same way, using the bracketing on my DSLR. I have experimented with the number and magnitude of the EV steps. Just three steps of ±1.0 EV seem to suit a more restrained HDR approach. This gives me three exposures: one underexposed, one close to the best exposure and one overexposed.

_IGP6682 _IGP6683 _IGP6684

For the first example I'm using Nik Software's (now Google's) HDR Efex Pro 2 plug-in via Lightroom. Whilst Lightroom has a mechanism to transfer photos out to plug-ins, it can only transfer a single image at a time. So instead of the Edit In tool I need to use the export function and the Google/HDR Efex Pro 2 option.

The Lightroom export settings required

Lightroom will take a little while to export these three files and then start up the Nik HDR merge. This occurs in two parts; the first is the Merge Dialogue, which offers three ways to avoid common artefacts and problems when taking multiple photos. These include aligning photos (step 3) when taking the HDR handheld; this is the default, so no action is required unless you had previously turned it off. Ghost removal (step 4) searches for and removes ghost images associated with things (usually people) that move between the three exposures. The grey frame around the upper photo indicates the photo that will be used as the ghost removal reference. The strength of 100% is the default, but the slider allows you to soften the degree to which ghost removal is undertaken. Whilst I'm sure this step takes time, I just leave it always on, as it has no impact that I can find when there is no movement.

The final artefact (step 5) is chromatic aberration (coloured fringes at the edges of strongly contrasting parts of the photo, e.g. tree silhouettes against the sky), because the HDR process can over-emphasise this effect. Different lenses and lighting will give different effects; I have found that moving the upper slider further to remove the red fringing, and the bottom slider a little less towards the blue, works well for most of the photos taken with my Pentax camera and Tamron lens combination. There is a magnifier at the bottom right of the reference image if you want to check how these settings are being applied. The lightness slider underneath the image helps you see and check detail in the HDR image. It does not affect the processing of the HDR image in the tone mapping steps to follow.
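For anyone curious what the merge step is actually doing, the sketch below shows an equivalent align-and-merge in open-source terms. This is OpenCV, not Nik's code, and it covers only alignment and merging (not ghost removal or chromatic aberration correction); the file names are the ±1 EV bracket from above, but the shutter speeds are assumed values.

```python
# A minimal open-source sketch of the align-and-merge step (OpenCV, not Nik's code).
# The shutter speeds are assumed values for a +/-1 EV bracket.
import cv2
import numpy as np

files = ["_IGP6682.jpg", "_IGP6683.jpg", "_IGP6684.jpg"]
times = np.array([1/250, 1/125, 1/60], dtype=np.float32)   # hypothetical shutter speeds
images = [cv2.imread(f) for f in files]

# Align the hand-held frames (the equivalent of step 3).
cv2.createAlignMTB().process(images, images)

# Recover the camera response curve, then merge into one 32-bit HDR image.
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)
cv2.imwrite("merged.hdr", hdr)
```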

Merge controls in NIK HDR Efex Pro 2

Clicking on the Create HDR button at the bottom right applies the corrections you have set to the individual exposures and then merges the three images. It can take a few seconds, and then takes you on to the interface controls for the tone mapping step. This is the dialogue window in which you will do all the remaining image adjustments. Nik have provided a library of presets, shown on the left-hand side, grouped under several effects. When you begin using the software it is probably a good idea to try the look of several presets; a lot of them are over the top. I quickly found I only liked the Realistic group, and then only a few of those. I personally find the Balanced preset (step 6) gives me a good starting point.

Tonemapping Interface controls in NIK HDR Efex Pro 2

However, it is just a starting point, and I suggest moving over to the right-hand side and working down through the options to edit your image. The first is tone compression (step 7); these sliders can have a really significant effect. Tone compression moved all the way to the left will leave the dynamic range much closer to a single photo (flatter), whereas moving it all the way to the right gives you an extreme dynamic range, putting detail in the shadows and highlights (it will usually look surreal and over-powering), but a slight move to the right often makes the image snap. The method and method strength also control the tone mapping, but in specific ways. Depth controls how the shadows and highlights are to be treated, and again moving to the right will enhance the dynamic range; Normal is fine for me. Detail is a lot like Clarity in Lightroom or Structure elsewhere in Nik software; it controls the local contrast and can bring up a lot of detail; Realistic is fine for me. Drama is like a contrast control and different images respond differently; I'm using Natural here, but moving to the right usually adds drama, particularly to a cloudy sky.
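To give a feel for what a tone mapping operator actually does with the merged 32-bit file, here is a rough sketch using OpenCV's Drago operator, continuing from the merged.hdr file in the earlier sketch. The parameters are only loose analogues of the Nik sliders, not a reimplementation of them.

```python
# A rough sketch of tone mapping with OpenCV's Drago operator; its gamma,
# saturation and bias parameters are only loose analogues of Nik's tone
# compression / method sliders.
import cv2
import numpy as np

hdr = cv2.imread("merged.hdr", cv2.IMREAD_UNCHANGED)   # 32-bit merge from the earlier sketch

# Arguments are gamma, saturation, bias: they control how the wide range of
# luminosity is squeezed down into a displayable image.
tonemap = cv2.createTonemapDrago(1.0, 1.0, 0.85)
ldr = tonemap.process(hdr)                             # float image, roughly 0..1
cv2.imwrite("tonemapped.jpg", np.clip(ldr * 255, 0, 255).astype("uint8"))
```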

For the next step, Tonality, I like to have the histogram displayed (step 8) in the lower right panel (normally it is a loupe/magnified view).

Tonality controls in NIK HDR Efex Pro 2

The tonality controls are very similar to the Basic tonal controls in Lightroom, but they work across the newly combined dynamic range of luminosity in the HDR image, and they work on the image as a whole. I wanted this image to better fill the histogram of the available dynamic range, so I increased the exposure a little, stretching the whole histogram to the right. I also increased the highlights and lessened the blacks, lifting the leftmost part of the histogram (step 9), and increased the saturation just a little (step 10). These last two adjustments could just as easily have been made in Lightroom, but it is convenient to do them here.

Normally I would finish here, but I found the wires in the background distracting, so I then used OnOne 10 Enhance and the Perfect Eraser (a context-sensitive fill) to remove them. I also cropped the photo a little.

G&T :: Backlit Glass

Sunday, April 23, 2017

ClimArt Exhibition

My "Paradise Lost" painting is currently on display (and for sale) at the ClimArt Exhibition at Artspaces Gallery in Wonthaggi. The exhibition runs until May 29th. If you are down that way, please drop in. Below are some deep dream style "sketches" of the exhibition opening, created on my phone using Prisma.

 

IMG_20170423_193846_processed IMG_20170423_194433_processed IMG_20170423_194242_processed IMG_20170423_193756_processed IMG_20170423_194647_processed

Tuesday, April 18, 2017

Directly Painting on a Tablet

I figured a good challenge for today would be to directly paint on my HP Spectre in tablet mode using the HP pen. The sun was coming out but there were a few fluffy clouds around, and I figured they would be a great subject. So I fired up Corel Painter and set up a coloured canvas with a rough paper texture. I then experimented with the pastel "brushes" and a directional blender.

Corel Painter's Colour Wheel

Obstacle one was the high reflection (glare) off the screen, even in the shade of an umbrella, similar to trying to sketch directly onto my phone. I just had to persevere with this one; sunglasses didn't help, they probably hindered.

Concern two was that out in the sun the colour wheel tool doesn't seem to give me an even range of colours. The deep blues all looked purple and the cyans decidedly green. I switched to the mixing tool but still found the blue of the sky difficult to judge. The screen blue was intense, but the sky was more so. I can see a lot of practice will be required in matching colour if I want to continue plein-air tablet work.

After a small siesta and a G&T I returned to do another; this time there were more clouds around, and I took a wider view and did a lot more blending. I did the colour matching from the shade and seemed to get more natural colours.

clouds1 clouds2

Directly Painting onto my Phone

I have been itching to try this out for a while (David Hockney has been doing it for years) and my new phone seemed the perfect excuse to try out some software. I had downloaded the free or trial versions of a number of apps: Wacom's Bamboo Paper, Autodesk Sketchbook, Corel Painter Mobile, Adobe Sketch and an app I had been using previously called Marker.

I have to admit it was not a joyful trial, more a test of my patience. Autodesk's and Adobe's apps wouldn't work without signing up for an account (not here in the land of low bandwidth). Corel's app wanted me to upgrade because my trial had run out? Strange, since I had just started to use it, but it did let me work. Wacom's Bamboo Paper is really set up to exploit their pens, but I could use the basic set of brushes. They also want you to purchase additional items. No thanks; for me it's try before you buy. I just want to try out plein air sketching directly onto the phone!

1492410893613 created with Marker app    Created with Bamboo Paper

So I did get to try three options, first using Marker, which simulates using marker pens. Way too bright and very limited colours, but it's free and easy to use. Wacom's Bamboo Paper is a nice little sketchbook/journal into which you add your new pages and sketch on them. The tools available in the free version are limited but quite serviceable; I can see myself experimenting more with these. I was expecting Corel Painter to be easy to pick up (since I have the full Corel Painter 2017 software now), however the touch screen mode is different enough to frustrate me, particularly in choosing colours. I gave up on my first attempt, took a picture to trace over and ended up testing out the colour cloning ability (with a very purple result). I think it is an issue of picking colours on a highly reflective surface, or maybe I've been out in the sun too long.

The Old Bearded Heath as a clone painting, created with Corel Painter

Saturday, April 15, 2017

Why I started using an Air Gapped Archive

I'm not paranoid or afraid of the NSA. I'm just trying to avoid the risk of ransomware, or the corruption of a disk via a virus, mechanical malfunction or a range of other things that might affect an active network and a hard disk that is always spinning. I realise my approach is somewhat hybrid and would not satisfy Edward Snowden or the security gurus, but it does give me confidence that I have addressed the long-term safe storage of my source photos.
The Little Netbook that became a Linux archive machine

I am using an old Toshiba netbook (it served me well as a photography and email connection to the world as I travelled around Europe in 2011). However, it was always slow, and it was the first computer I tried to update to Windows 10; basically, Windows 10 killed it! I ran out of patience and installed Linux and a little photo software, but it has sat unused for the past year or so. Then it struck me that this would be ideal for the air-gapped archive. I leave it turned off most of the time, turn off the WiFi and Bluetooth, and just use it to manage my photo archives on an external drive. It has three USB slots (unfortunately USB 2.0, not 3.0, i.e. file transfer will be slow) so I can do my transfer of files to the archive on this little Linux computer. I can also preview the files with XnView when necessary.
The air gap isn't perfect, because I have to get the photos onto the external hard drive (i.e. connect it to a network somewhere), and at regular intervals the archive must be updated. Again, I do this for convenience by creating a new copy of the archive and updating that. This is the generational approach. The new copy is the SON, and this becomes the new air-gapped archive. The old copy is the FATHER and can be put away until the next cycle, when it is rewritten as the new SON. In the old tape backup days this was often taken to a third, GRANDFATHER, generation. At present I only have enough spare hard disks to do two generations, and I only plan to cycle them every three months.
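As a rough illustration of that rotation (not a script I actually run), the sketch below snapshots the live collection onto the drive that will become the new SON; all the paths are hypothetical.

```python
# A minimal sketch of the SON/FATHER rotation, not a production backup script.
# All paths are hypothetical; the drive that held the previous SON copy is
# relabelled FATHER and put away offline after this runs.
import shutil
from datetime import date
from pathlib import Path

LIVE_ARCHIVE = Path("/home/me/photos/archive")   # the working copy to snapshot
SON_DRIVE = Path("/media/archive_son")           # becomes the new air-gapped archive

def write_new_son():
    target = SON_DRIVE / f"photo_archive_{date.today().isoformat()}"
    shutil.copytree(LIVE_ARCHIVE, target)        # full copy = the new SON generation
    print(f"New SON generation written to {target}")

if __name__ == "__main__":
    write_new_son()
```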
I plan to use small memory sticks (only used for that purpose) to extract files if necessary between generation updates, and this in theory reduces the likelihood of large-scale malware, particularly ransomware, but does not avoid it altogether (e.g. Stuxnet). The fact that I'm also crossing operating systems significantly reduces the possibility of viruses being transmitted.
The Linux computer and the hard drive containing the primary archive are turned off and stored together in my studio, rather than in my office where the online collection of my photos is stored. Hopefully I will only need them every three months or so, as I cycle and then update my primary archive.


Friday, April 14, 2017

Extending Dynamic Range

Contrary to what a lot of people believe, digital cameras, even the best ones, are not capable of capturing the full extent of light from the darkest blacks to the purest whites. They struggle to get close to the capabilities of traditional film, and they are certainly nowhere near as capable as the human eye. This range of illumination is usually referred to as dynamic range. This limited range, coupled with an automatic light meter that is trying to get an average exposure (i.e. matching a mid-tone grey overall), is the reason a lot of images turn out boring, washed out and flat looking, even though you remember much stronger light (and often colour).

_IGP6498 Untouched RAW Image as Rendered by Lightroom

Here are a couple of techniques that can help you get a wider apparent dynamic range in your final photo.

IMAG0167 - AutoHDR using HTC U Play Camera

The first is to use a camera app on your phone that will take an "AutoHDR" shot (this usually means it will take three exposures and combine them using software within the app; a few will do pseudo HDR, a bit like the next approach). This is the simplest approach and the results can be surprisingly good.

_IGP6497

A second approach is to take a RAW image (this one is in .PEF format) and use software (I'm using the Lightroom Develop basic panel only). I'm following what I once called SDR+ (standard dynamic range plus), and I even wrote a couple of presets to do it. The basic idea is to bring up detail in the shadows (usually moving the shadows slider to the right). Sometimes I also change the blacks using the approach often called "finding the black point": while on the blacks slider, hold down <alt> and the screen turns black, then adjust the slider until a few white areas (these are the areas that will be rendered pure black) show up. You can do a similar operation with the whites slider. The objective is to get the illumination histogram to be as evenly spread across the range as possible. This doesn't take all that long, and I prefer to do it manually over even my own presets, but it requires specialist software, some skill in that software, and a camera that can take RAW files. Many RAW purists will explain that this is why you should always take RAW images.
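The Lightroom sliders are proprietary, but the underlying idea of picking black and white points and spreading the histogram between them can be sketched in a few lines of Python. This is only an illustration of the principle, not my SDR+ presets; the file name and the 0.5% clipping percentiles are assumptions.

```python
# Not the SDR+ presets, just an illustration of setting black and white points
# so the histogram spreads across the full range. The file name and the 0.5%
# clipping percentiles are assumptions.
import cv2
import numpy as np

img = cv2.imread("_IGP6497.jpg").astype(np.float32)

# Levels below/above which only 0.5% of pixels fall become the new black and
# white points (the areas that will be rendered pure black or pure white).
black_point = np.percentile(img, 0.5)
white_point = np.percentile(img, 99.5)

# Stretch everything between those points across the full 0..255 range.
stretched = (img - black_point) / max(white_point - black_point, 1e-6) * 255.0
cv2.imwrite("_IGP6497_stretched.jpg", np.clip(stretched, 0, 255).astype(np.uint8))
```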

_IGP6496_HDR

The third approach is classic HDR (High Dynamic Range), and here I am using three bracketed +/-1 EV exposures, then the Nik HDR Efex Pro 2 software and the Balanced preset to create a new 16-bit colour .tiff file. There are many variations of the HDR technique and software, but it normally involves two steps. The first is collecting and preserving the full range of illumination values from the three or more exposures. The second is "tone mapping" these into the available dynamic range of your image (and this is also related to how you might reproduce the image, e.g. print or screen). The downside is that these steps and any image prep take extra time, and a lot of folk get carried away with the slider options in the "tone mapping" step, thereby producing the sickly vivid and disturbingly surreal images that have given HDR a bad name. I personally think this method has given me a final photo much closer to the late afternoon light I was photographing in. I have to admit I do take a lot of HDR images, particularly in contrasty light or where I know I need to preserve the highest dynamic range possible.
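For completeness, there is also exposure fusion, which blends the bracketed frames directly without an explicit tone mapping pass. The sketch below uses OpenCV's Mertens fusion; it is not what Nik HDR Efex does, and the bracket file names are placeholders.

```python
# Exposure fusion (Mertens) blends the bracketed frames directly, with no
# exposure times and no separate tone mapping step. This is OpenCV, not the
# Nik HDR Efex pipeline; the bracket file names are placeholders.
import cv2
import numpy as np

images = [cv2.imread(f) for f in ["bracket_under.jpg", "bracket_mid.jpg", "bracket_over.jpg"]]
cv2.createAlignMTB().process(images, images)          # align hand-held frames

fused = cv2.createMergeMertens().process(images)      # float image, roughly 0..1
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```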

So if you are disappointed with flat, washed-out images, you might like to investigate at least one of these approaches.

Thursday, April 13, 2017

How Wide can I go?

Whilst on my endless summer trips I've gone a bit overboard taking very wide panoramas. It is very easy with a digital camera to rattle off the 30%-overlapping images to make a stitched panorama; I have mostly used Autostitch to do this, although Microsoft's ICE is another great stitching program, and there are lots of other programs and apps. Because the wide view suits so many of the vistas I'm visiting, I'm also opening out my sketchbooks to paint across two adjacent landscape pages. This is all wonderful and enjoyable in the field, but it has a downside: it takes time. In fact, the bigger the panorama I want, the longer the stitching and other post-processing of the photos takes. On busy travel and field days I simply run out of time, and/or I'm likely to fall asleep waiting for the stitching to complete in the evening.
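If you want to see how little code basic stitching needs these days, here is a sketch using OpenCV's high-level Stitcher rather than Autostitch or ICE; the frame file names are placeholders for a set of roughly 30%-overlapping shots taken left to right.

```python
# A minimal panorama sketch using OpenCV's high-level Stitcher (not Autostitch
# or Microsoft ICE). Frame file names are placeholders for overlapping shots.
import cv2

frames = [cv2.imread(f"pano_{i:02d}.jpg") for i in range(1, 8)]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", pano)
else:
    print(f"Stitching failed with status code {status}")
```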

Sketch at Gypsy Point

As my LG phone got terminally ill, and misbehaved several times, I lashed out and got a new HTC U Play. (Why HTC? Well, it was not the cheapest, but my first HTC was super reliable and easy to use and is still going strong as the GPS in my camera bag, so it was able to take over from the ailing LG.) The U Play was equally easy to use; the one downside is the beautiful case and screen are a bit slippery, so a slip-on cover (red, so I don't leave it behind) was the first accessory. Then it was out to try the camera, and I just love it! Here is my first attempt at panorama mode.

Jell Park Lake Panorama

Now that did take a few seconds to stitch in the phone, but it is a wonderful panorama. I might have been able to replicate this with a DSLR and Autostitch, but that would easily take five minutes or longer!

Screen capture of a recent 3-across Instagram post

My passion for the wide view has very much moved into my Instagram accounts, especially @apimageo, which is still very much stuck in the 3-across grid view (you have to look at my profile to see it). I also recently noticed that @packtography (he's worth a follow) includes a camera, exposure and post-processing summary with most posts. I reckon this is a great idea, so I will try to keep including such info. Sharing your settings is, to me, an important insight that lets others understand your work. BTW, my Flickr posts have always included the extra EXIF data automatically, if you are interested.

Monday, April 10, 2017

Experimenting in Corel Painter

I've never been one to follow the "workflow" of the software experts; doing so is just likely to produce the same results as most others. I believe creativity comes with learning to use your tools in depth and in a personal way. There is nothing like just rolling up your sleeves, jumping in and going for it.

I have had Painter Lite, then Painter Essentials, for a while, so I'm familiar with the basic workspace, but I wasn't prepared for all the different brushes. My first attempt to just try them all, one after the other, created a confused mess. I did already like the Sargent and Impressionist brushes, so I figured I should start with them and work through some of the brush settings: the clone modes (copy colour, or copy image), dab patterns, blend/bleed and just straight colour selection from the colour wheel. I had an HDR image of the shadows under a backlit oak that looked promising.

_IGP6447_HDR

I first made a very rough underpainting using a very broad Sargent brush, doing just a colour clone to rough in the basic composition, and then used it in a thinner size to mark in some of the detail. Quick and dirty, just to establish the composition.

Sargent brush underpainting

Then I switched over to the Impressionist brush and varied between adding my own colour and using colour cloning. I also concentrated on making sure the direction of my strokes followed the shapes, and that the direction of light, as described by the highlights and shadows, was consistent.

Final Digital Painting

Even though this was painted inside on a rainy day, I do think it has a spontaneous freshness, not as obvious in the original photo (despite it being HDR).

Sunday, April 09, 2017

Seeing :: The Essence of a Tree

My Sketch

As an exercise in seeing, suitable for a PhotoWalk-style session, I figured the ideal subject might be a tree. So first you need to make a sketch of an idealized tree (it could be a line drawing, a rough tonal sketch, or even coloured in). This exercise works well in a group, so let the other photowalkers see it, but then put the sketch away. The challenge is to find and photograph a tree that most closely resembles your tree. It might be harder than you think, especially if you don't keep checking your sketch.

DeepDream_tree (based on my sketch)

OK, I cheated a bit; I knew there was a round tree.

What is going on here? Well, you are doing three things:

  1. Previsualising the image
  2. Learning to see a photo before you take it
  3. Framing your photo to match your expected view

These are three skills that expand your ability to see a good photograph beyond just looking through the viewfinder or at the phone screen.

Thursday, April 06, 2017

Making A Hash Of Checksums

I come from a very deep computer background (I first used computers in 1969) and have managed several minicomputers (DG, PDP & VAX) and networks of PCs, so I am well grounded in the need for both backup (keeping a safe copy) and archiving (long-term storage). Now if you read a lot of net advice you might think they are the same thing and just call the whole process backup. Backup is important, and a lot of people still don't do it adequately, and I have written a fair bit about it, but archiving seems to have escaped my posting attention (other than my The Importance of Cataloging your Digital Photo Archive post). I need to rectify that, and I will begin with a bit more detail on how I seek to avoid storing, and how I identify, corrupt files.

MD5 checksums (aka hash values)

Utility in Total Commander that creates MD5 Hash Table

With the recent death of two Western Digital 2GB external drives, I have become a bit paranoid about the potential for file corruption as a drive slowly fails (one drive just dropped dead, the other slowly started to show problems). Because most of the files are in a binary format, such corruption can easily go unnoticed, and the corrupt file could easily be diligently backed up. There are solutions, like having mirrored drives (extra hardware, software and expense) or regularly reviewing the files (a challenge with lots of photos). I have found a much simpler approach: use checksums created from each file. If the file is corrupted, the checksum will change, so each time a file is moved or the disk is rotated these checksums can be checked.

Example of the MD5 Hash Values

I chose MD5 because it is a public format, there are a lot of utilities to create the checksums, and they are widely used to detect duplicate files (particularly photo, video and music files). It would be nice if photo software undertook this task automatically, but alas, while I can see similar numbers in Picasa and Lightroom, they are not true MD5 hash values, they just look like them. My only conclusion is that they are proprietary formats, which creates the risk that if the company disappears (Google has already washed its hands of Picasa), so does their long-term suitability as checksums. The process to create these files can take some time. I am using Total Commander, which creates the checksums for an entire subdirectory (folder) and writes a single .md5 file containing them. This is an ASCII file and can easily be read, or a particular hash value cut and pasted into a different utility to verify checksums. The files are small and take up negligible space.

Example of the MD5 checking process

Verifying the checksums also takes a little time, because the checking software must read through each file. Good utilities will be able to read the .md5 file and report missing files or errors (i.e. corrupted files). Total Commander has a second utility to do this job. So the final big question is: "how do I know the photos are OK, not corrupted, when I make the MD5 hash value?" The simple answer is I can't be sure, so I also have to at least look at the files, using Picasa, the default Windows photo viewers, Lightroom or Corel AfterShot Pro (which actually seems the fastest option, particularly with RAW files). So setting up the checksums for a proper archive does take some time, but I can have more confidence in that archive as it is passed around different media and locations.
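For anyone without Total Commander, the same idea can be sketched in a few lines of Python using the standard hashlib module. I am not reproducing Total Commander's exact file layout; this writes the common md5sum-style "hash *filename" lines, and the folder paths in the commented-out example calls are hypothetical.

```python
# A small sketch of creating and verifying an .md5 file for a folder of photos.
# It writes md5sum-style "hash *filename" lines, which may differ slightly from
# Total Commander's layout; folder paths in the example calls are hypothetical.
import hashlib
from pathlib import Path

def md5_of(path, chunk=1 << 20):
    """Read the file in 1 MB blocks and return its MD5 hash as hex."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def write_md5_file(folder, out_name="archive.md5"):
    folder = Path(folder)
    lines = [f"{md5_of(p)} *{p.name}"
             for p in sorted(folder.iterdir())
             if p.is_file() and p.suffix.lower() != ".md5"]
    (folder / out_name).write_text("\n".join(lines) + "\n")

def verify_md5_file(md5_path):
    md5_path = Path(md5_path)
    for line in md5_path.read_text().splitlines():
        recorded, name = line.split(" *", 1)
        target = md5_path.parent / name
        if not target.exists():
            print(f"MISSING  {name}")
        elif md5_of(target) != recorded:
            print(f"CORRUPT  {name}")     # checksum no longer matches
        else:
            print(f"OK       {name}")

# write_md5_file("/media/archive_son/2017-04")          # hypothetical archive folder
# verify_md5_file("/media/archive_son/2017-04/archive.md5")
```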

Wednesday, April 05, 2017

#AIart Shoot Out

I was certainly an early follower of the developments in the deep dream style art filters, which use neural networks to recognize and amplify aspects of any photographic image. I must admit I like the potential of the feed-forward neural networks that can replicate line work and other mark-making characteristics of the training image. I chose to call this Deep Dream Style, and started giving such creations the hashtag #AIart (which seems to have caught on on Instagram). My prediction that these might become over-used art filters is probably also an astute observation; there is certainly a plethora of such apps. Fortunately most are based around famous artists and great art. Even Adobe is joining the crowd, researching an AI style filter that can "copy" the look from another photo. I still think these filters can be used creatively, rather than as the "look at me" plagiarism so obvious in social media today.

Testing my new phone camera contre-jour

This photo was just an experiment, the first photo I took with my new HTC U Play mobile phone. I was trying out the AutoHDR feature, and it handled the difficult strong lights and shadows well. It's just afternoon tea, aka coffee and fresh figs. It is an interesting enough composition, so I thought I would try the image across a variety of current #AIart tools (Google deep dream style on the computer, the Dreamscope app and Prisma on the phone).

Google deep dream style / Dreamscope app / Prisma

Which rendering do you like best?