Posts Tagged ‘test’

Wednesday, March 24th 2010

Speed: Lightroom 3 beta 2 vs Lightroom 2

I’ve done a few timed tests of some heavy tasks in Lightroom to see what progress has been made since the first beta of version 3.

Importing

Here’s what I wrote about the first beta back in October:

I imported (added to a new catalogue) two folders containing a total of 295 photos from my Canon EOS 30D. I chose to render 1:1 previews at the same time. This took just over six minutes in Lightroom 2, but almost three times as long in Lightroom 3 beta!

This time I ran a slightly different test, but it is obvious that beta 2 is a lot more efficient than beta 1. I imported 200 raw photos from the memory card in my card reader and chose to render 1:1 previews at the same time.

The actual import (copying the files) took roughly 2 minutes 50 seconds in both versions. The rendering, though, took 4:25 in Lightroom 2 and 6:29 in Lightroom 3. That is almost 50 percent longer.
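A quick sanity check of that percentage, as a small sketch (the times are just the figures quoted above):

```python
def to_seconds(mm_ss: str) -> int:
    """Parse an 'm:ss' string into total seconds."""
    minutes, seconds = mm_ss.split(":")
    return int(minutes) * 60 + int(seconds)

lr2 = to_seconds("4:25")   # Lightroom 2 rendering time
lr3 = to_seconds("6:29")   # Lightroom 3 beta 2 rendering time

slowdown = (lr3 - lr2) / lr2 * 100
print(f"LR3 beta 2 rendering is {slowdown:.0f}% slower")  # → 47% slower
```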

Still, it is a great deal faster than beta 1.

Exporting

I also timed how long it took to export 100 raw files to full-size JPEGs. (No sharpening in the export settings.)

Lightroom 2 took 2:31 and Lightroom 3 beta 2 took 2:55. While it is still 16 percent slower than Lightroom 2, the new beta is a lot faster than the first beta. Here’s another quote from my previous review:

I also tried exporting 82 photos to full-size JPEGs. This took 95% longer in 3 beta, even though it was using roughly 80% of the CPU compared to around 63% for v2.

Navigating Through Photos

I tried to think of a way to measure how quickly Lightroom displays photos when flicking through them. I decided to measure how long it took to click through 19 photos in the Develop module, waiting for the Loading indicator to disappear for each photo.

Obviously, this method does introduce some user error, but I still believe it is accurate enough to give an idea of the responsiveness of the two versions.

The test took 45 seconds in Lightroom 2 and 57 seconds in Lightroom 3 beta 2. That’s 27 percent more time. Lightroom 2 definitely felt snappier too while doing this test, so I think it is a fair result.

Conclusion

Adobe have managed to make huge improvements to the speed of both importing (preview rendering) and exporting since the first beta of Lightroom 3. Beta 2 still lags behind Lightroom 2 in these tasks, but I feel the speed difference is not a big deal any more considering how much the image quality has been improved thanks to the new rendering engine.

Friday, January 22nd 2010

Sharpness test: Sigma 17-70mm vs Canon 17-55mm

After buying my second-hand EF-S 17-55mm f/2.8, I sold my Sigma 17-70mm f/2.8-4.5. But before I sent it off to the buyer, I took some test shots for a little comparison of the two lenses.

I set my camera up on my tripod and took shots of our bookcases from roughly 2.5 meters away, at a right angle. I took photos at 17mm, 35mm and 55mm with both lenses, and at each focal length I took photos at f/2.8, f/4.0, f/5.6 and f/8.0. (Obviously, the Sigma doesn’t do f/2.8 at 35mm and 55mm.)

One thing I did notice fairly soon was that the Sigma’s autofocus was much less reliable than the Canon’s. For some of the shots I ended up having to manually hunt for the optimum focus distance. The Canon got it right every time.

From each test shot I have cropped out sections from the centre, mid and edge areas. All in all, 66 squares of 300×300 pixels, which I have arranged in (hopefully) pretty tables below.
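The count of 66 follows from the shooting plan above; a tiny sketch of the arithmetic:

```python
# Count the crops: both lenses do four apertures at 17mm, but the
# Sigma can't reach f/2.8 at 35mm and 55mm, so it only does three there.
shots = {
    "17mm": {"Sigma": 4, "Canon": 4},
    "35mm": {"Sigma": 3, "Canon": 4},
    "55mm": {"Sigma": 3, "Canon": 4},
}
total_shots = sum(n for lens_counts in shots.values() for n in lens_counts.values())
crops = total_shots * 3  # a centre, mid and edge crop from each shot
print(crops)  # → 66
```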

As you can see in the overview photos, the sections are taken from different places for the different focal lengths. (To use the areas of the bookcases with the most detail in them.)

I’ve put my own conclusion in words at the end, after all the tables.

Overview of sharpness test of Sigma 17-70mm f/2.8-4.5 and Canon EF-S 17-55mm f/2.8 IS [overview photo]

17mm – Centre
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/2.8, f/4.0, f/5.6 and f/8.0]
17mm – Mid
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/2.8, f/4.0, f/5.6 and f/8.0]
17mm – Edge
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/2.8, f/4.0, f/5.6 and f/8.0]

Overview of sharpness test of Sigma 17-70mm f/2.8-4.5 and Canon EF-S 17-55mm f/2.8 IS [overview photo]

35mm – Centre
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/4.0, f/5.6 and f/8.0; Canon only at f/2.8]
35mm – Mid
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/4.0, f/5.6 and f/8.0; Canon only at f/2.8]
35mm – Edge
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/4.0, f/5.6 and f/8.0; Canon only at f/2.8]

Overview of sharpness test of Sigma 17-70mm f/2.8-4.5 and Canon EF-S 17-55mm f/2.8 IS [overview photo]

55mm – Centre
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/4.0, f/5.6 and f/8.0; Canon only at f/2.8]
55mm – Mid
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/4.0, f/5.6 and f/8.0; Canon only at f/2.8]
55mm – Edge
[Crop comparison table: Sigma 17-70mm f/2.8-4.5 vs Canon EF-S 17-55mm f/2.8 IS at f/4.0, f/5.6 and f/8.0; Canon only at f/2.8]

Conclusions

On the whole, in almost all of the little squares, the Canon is running circles round the Sigma. No pun intended actually.

Surprisingly though, the Sigma looks sharper than the Canon in the centre and mid areas of the frame when using f/2.8 at 17mm. The Canon seems to suffer from some kind of fringing here. (At the edges though, the Canon is better.)

To summarize, it was much as I had hoped and expected. But I would be lying if I said I wasn't a little disappointed with the Canon's performance at 17mm. At the same time, I don't think that fringing will be very visible with most subjects. It would take a lot of fringing to outweigh the benefits of having image stabilization.

Tuesday, May 5th 2009

Autofocus test of the Sigma 17-70mm

While reading about the Canon EF-S 17-55mm, which I’m considering getting second hand, I saw an autofocus test where it performed flawlessly. That is, finding perfect focus 20 out of 20 times. Not all other lenses tested did though. The Sigma 18-50mm f/2.8 and the Tamron 17-50mm f/2.8 missed a few. (See the test towards the end of this page.)

This got me curious about how well my own lens focuses. (Obviously, the camera plays a part here too, probably a fairly major part.) Anyway, I set up a similar test with my Sigma 17-70mm on my Canon 30D.

I put the camera on my tripod, about 1.5 meters from a frame with our wedding photos. I aimed the centre focus point at the dark edge of a photo, like this:

Autofocus test of the Sigma 17-70mm lens.

Then I manually set the focus at infinity (or beyond infinity actually), activated the autofocus again, and simply pressed the shutter-release (cable).

I repeated the process twelve times, and then did another twelve setting the focus at the closest focusing distance and another twelve setting it at about 0.7 meters.

In all three cases, the lens got 12/12 sharp, focused shots, like this 100% crop here:

Autofocus test of the Sigma 17-70mm lens.

The shots were taken at 48mm, 1/20th sec, f/4 and ISO 400, to give you a feeling of the amount of light. The room wasn't dark, but not very bright either.

Not a very thorough test, but at least it shows the autofocus is nowhere near as lousy as some people make this lens out to be.

Monday, March 30th 2009

Beta Browser Battle 2: Page-load times

This is part two of my comparison of the latest browsers. (Part one is here.) This time I compared page-loading times, just as Betanews recently did.

The browsers I’m comparing are …

  • Firefox 3.1 beta 3
  • Safari 4 beta
  • Internet Explorer 8
  • Opera 10 alpha
  • Chrome 2 beta

I compared the browsers on five different sites / web pages:

  • youtube.com
  • the Facebook home page (logged in)
  • msn.com
  • the Wikipedia article on Munich
  • ebay.com

Results

Let’s just get straight to the results. I’ll go through my methods later.

Graph showing page-load times for Firefox 3.1 beta 3, Safari 4 beta, Internet Explorer 8, Opera 10 alpha and Chrome 2 beta

In the graph above, the average page-load times for all five web pages have been added together, as have the 95% confidence intervals. All in all, this graph is based on 500 page loads.

Chrome and Firefox are tied for first place – their confidence intervals overlap. Safari and Internet Explorer are tied for third, and Opera is fifth.
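The "tied" calls above come from checking whether the confidence intervals overlap. A minimal sketch of that check (the interval endpoints below are invented for illustration, not the measured data):

```python
def intervals_overlap(a, b):
    """True if two (low, high) confidence intervals share any range."""
    a_low, a_high = a
    b_low, b_high = b
    return a_low <= b_high and b_low <= a_high

# Hypothetical summed load-time intervals in seconds:
chrome  = (18.0, 19.5)
firefox = (19.0, 20.5)
opera   = (27.0, 29.0)

print(intervals_overlap(chrome, firefox))  # → True  (overlap: tied)
print(intervals_overlap(chrome, opera))    # → False (significantly different)
```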

Method

For each combination of browser and web site I did a total of 20 page-loads. I measured one web page at a time, working my way through the five browsers.

Since network traffic and page weight can vary over time, I did them in two sets of ten measurements. First I did ten measurements with the browsers in one order: A, B, C, D and E. Then I did ten measurements in the opposite order, starting with browser E. I also rotated the five browsers between A, B, C, D and E for the five different web pages.
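The counterbalancing scheme above (one set forward, one set reversed) can be sketched like this, with the browser names standing in for A-E:

```python
browsers = ["Firefox", "Safari", "IE", "Opera", "Chrome"]

def measurement_order(browsers, runs_per_set=10):
    """First set: ten runs in one order; second set: ten in reverse,
    so a slow network period doesn't penalize any one browser."""
    forward = list(browsers)
    backward = list(reversed(browsers))
    return [forward] * runs_per_set + [backward] * runs_per_set

order = measurement_order(browsers)
print(len(order), order[0][0], order[10][0])  # → 20 Firefox Chrome
```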

Before timing the page-loads, I shift+reloaded (or the equivalent ctrl+reload in IE) the web page ten times to saturate any network cache and to get the browser warmed up. I did this for each browser, before each set of ten measurements. (Ten reloads might sound excessive, but I started off doing only three, which turned out to be too little to reach the shortest load times.)

Between each page-load I cleared all browser data (cookies, cache, etc.). The exception was Facebook, where I kept cookies and sessions so I could time the Facebook home page while logged in.

To time the page-loads I used this Javascript page-load timer. As the Microsoft white-paper on testing browsers says, this could introduce an observer effect. But I think we can assume that the Javascript that is being executed is pretty simple and shouldn’t affect the times noticeably.

This test showed that Google Chrome 2 beta is not 100% stable. It hung twice (in 100 page loads) and produced load times of over 30 seconds. I decided to remove these values and replace them with new ones.

Results in detail

In these graphs, each bar shows the average of 20 page-loads. The error bars represent 95% confidence intervals.
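A sketch of how a mean and 95% confidence interval can be computed from such samples. The load times below are made up, and this uses a normal approximation; with only 20 samples a t-distribution would give a slightly wider interval, but the idea is the same:

```python
import statistics

def mean_and_ci(samples, confidence=0.95):
    """Return (mean, half_width) of a confidence interval,
    using a normal approximation for the critical value."""
    n = len(samples)
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / n ** 0.5              # standard error of the mean
    z = statistics.NormalDist().inv_cdf((1 + confidence) / 2)  # ≈ 1.96 for 95%
    return mean, z * sem

# Hypothetical load times in seconds for one browser/page pair:
times = [2.1, 2.3, 2.0, 2.2, 2.4, 2.1, 2.2, 2.3, 2.0, 2.1,
         2.2, 2.3, 2.1, 2.0, 2.4, 2.2, 2.1, 2.3, 2.2, 2.1]
mean, half = mean_and_ci(times)
print(f"{mean:.2f} ± {half:.2f} s")
```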

Chart or graph showing page-load times for Firefox 3.1 beta 3, Safari 4 beta, Internet Explorer 8, Opera 10 alpha and Chrome 2 beta on youtube.com.

For youtube.com, Firefox and Chrome are tied for first. Safari and Internet Explorer are tied for third. Opera is last.

Chart or graph showing page-load times for Firefox 3.1 beta 3, Safari 4 beta, Internet Explorer 8, Opera 10 alpha and Chrome 2 beta on the Facebook home page.

The Facebook home page loads fastest in Firefox and Chrome, whose confidence intervals only just overlap. The other three browsers are significantly separated.

Perhaps it is the fairly Javascript-heavy nature of Facebook that makes it load so slowly in IE8?

Chart or graph showing page-load times for Firefox 3.1 beta 3, Safari 4 beta, Internet Explorer 8, Opera 10 alpha and Chrome 2 beta on msn.com.

Msn.com: Chrome and Internet Explorer are tied for first. Firefox and Safari are tied for third. Opera is last, again.

Chart or graph showing page-load times for Firefox 3.1 beta 3, Safari 4 beta, Internet Explorer 8, Opera 10 alpha and Chrome 2 beta on a Wikipedia article.

I decided to test the browsers on a long Wikipedia article with lots of images. I looked up Munich, which turned out to be a good candidate.

Chrome and Firefox are tied for first place. Safari is third, Opera fourth and IE fifth.

Chart or graph showing page-load times for Firefox 3.1 beta 3, Safari 4 beta, Internet Explorer 8, Opera 10 alpha and Chrome 2 beta on ebay.com.

Finally, ebay.com: Chrome, IE and Firefox are all tied for first place. Safari is tied with Firefox but slower than Chrome and IE. Opera is last.

Conclusions

Chrome sucks web pages off the Internet like an Electrolux. So does Firefox. In this test I haven't managed to separate them significantly. As we can all see, Chrome has a lower average sum than Firefox, and perhaps with more data it would be possible to separate them statistically.

Opera is the slowest of the lot, which surprises me. Opera was also slowest in the start-up test. Perhaps, though, we should cut it some slack: it's labelled alpha after all, and performance might improve when it reaches beta and final status. Opera also has a turbo feature in the works, but that is kind of cheating since it lowers image quality through more aggressive compression.

Obviously, this test could be improved in mainly two ways: I could test more web sites, and I could do more page loads for each site. But the test already comprised 500 timed page-loads and 500 non-timed page-loads, and it took me more than a day to complete.

It’s also worth noting that this test is pretty much consistent with Betanews’ page load test, where Chrome 2 beta wins and Firefox 3.1 beta 3 is second.

This test was done with clean browser cache. I’m considering doing the same test but without clearing cache and cookies for each page load. After all, that’s how most page loads are done in the real world. A user who visits any of these five sites will most likely have been there many, many times before. I just need to figure out a good set-up for such a test.

Wednesday, March 25th 2009

Beta Browser Battle: Start-up Times

A few days ago I compared the four different releases of Firefox for start-up time (cold and warm) and page loading time. It got quite a lot of attention so today I decided to compare the five latest preview releases from the big five:

  • Firefox 3.1 beta 3
  • Safari 4 beta
  • Internet Explorer 8 (since there is no IE9 beta)
  • Opera 10 alpha
  • Chrome 2 beta

This time I did things a little more scientifically, following Justin's suggestion in the comments. I made a batch file for each browser that prints the exact time and then launches the browser, opening a page with a script that shows the exact time again. The time difference equals the launch time.
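The original timing was done with a Windows batch file; as a rough cross-platform sketch of the same idea (the browser command is a placeholder, and waiting for the process to exit is a simplification of reading the timestamp off the loaded page):

```python
import subprocess
import sys
import time

def time_launch(cmd):
    """Launch a process and time it until it exits. In the real test the
    'end' timestamp came from a script on the page the browser opened."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Stand-in command; a real run would launch the browser with the timer page,
# e.g. ["firefox.exe", "http://localhost/timer.html"] (path is hypothetical).
elapsed = time_launch([sys.executable, "-c", "pass"])
print(f"stand-in launch took {elapsed:.3f} s")
```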

It should be noted that this method required me to opt out of Chrome's default "new tab" page, with suggested sites. Whether this affects the results in any real way is unknown, but personally I doubt it. The new tab page in Chrome loads very quickly.

Cold start-ups (directly after booting your computer) are the ones that can sometimes feel like an eternity. For that reason I think it is more important to have a fast cold start-up than a relatively fast warm one (warm start-ups are generally about 5-10 times faster anyway). So let's start with cold start-ups.

I did ten measurements for each browser, a fairly big sample size that gives the tiny 99% confidence intervals visible in the graph below.

Graph showing cold start-up times for Firefox 3.1 beta 3, Chrome 2 beta, Safari 4 beta, Opera 10 alpha and Internet Explorer 8.

IE8 is the winner here (2.40 secs), slightly faster than Chrome (2.66 secs). All browsers are, with 99% probability, significantly different (none are tied). However, this comparison was done on my Windows XP computer, so IE8 has an unfair advantage: who knows how large a part of Internet Explorer is pre-loaded with the operating system? That makes Chrome's performance all the more impressive.

Safari is marginally faster than Firefox (4.98 vs 5.19 secs). Surprisingly, Opera (7.14 secs) is roughly two seconds slower than Firefox and Safari. I actually thought it would be at least as fast.

Now let’s have a look at warm start-up times. I launched the browsers four times before starting the timing. Then I did 15 measurements for each browser.

Graph showing warm start-up times for Firefox 3.1 beta 3, Chrome 2 beta, Safari 4 beta, Opera 10 alpha and Internet Explorer 8.

Here, Chrome is in a league of its own with an average of 0.247 secs. Firefox and Opera are tied. They took on average 0.530 and 0.531 secs respectively, and their confidence intervals overlap. IE8 averaged 0.575 seconds and Safari came in last with 0.617 seconds.

Conclusion

Chrome impresses the most, even if IE8 launches slightly faster after reboot. Firefox and Safari are pretty similar, while Opera clearly is the slowest for cold start-ups.

These results really explain (and justify) Chrome’s good reputation for speed.

I’m curious if the differences are as large when it comes to page-loading. I’m planning on doing such a comparison too, I just need to work out a good solid method. So stay tuned if you like this kind of stuff.

Saturday, March 21st 2009

Once More: Firefox 3 is Not Bloated

Just in from the insane-browser-prophecies department:

Despite the fact it’s not really ready for human consumption, Chrome has won. Firefox is already dead. The only way the situation can be altered is for Mozilla to slam on the brakes, lean out of the window of the truck, apologize for going the wrong way, and turn around. But that’s unimaginable.

Kier Thomas

*Snicker*

Well, that will only be true if all Firefox users migrate to Chrome. Why would they do that? Chrome does not provide any advantages that seem significant enough for a long-time Firefox user to switch.

If you consider how many Firefox users have special extensions installed, the above scenario seems even more unlikely. Even if Chrome did have equivalents for all the Firefox extensions, it doesn't provide enough benefits to justify the hassle of swapping out Firefox.

And what does Mozilla have to apologise for? Ripping up Microsoft's monopoly? Opening the doors for standards-based coding?

Kier and others are making out that Firefox has become bloated and slow. So I decided to do a quick comparison of the different releases of Firefox on my four-year-old AMD 3200+ Windows XP PC.

I installed Firefox 1.0.8, Firefox 1.5.0.12, Firefox 2.0.0.20 and Firefox 3.0.7.

Installation files for Firefox 1, 1.5, 2 and 3.

I set up a clean profile for each version and set them to open a blank page by default. Then I made sure to close all other programs and launched each version of Firefox five times in a row, timing the launches with a stopwatch. (From hitting Enter to seeing the big white browser window.) And then I repeated the whole process so I got ten values for each browser.

I tested my reaction time, which was 0.23 seconds on average, and subtracted this from all the measured times.
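The correction itself is simple; as a sketch (the raw stopwatch readings below are made up for illustration):

```python
REACTION_TIME = 0.23  # measured average reaction time in seconds

# Hypothetical stopwatch readings for one Firefox version:
raw_times = [0.81, 0.79, 0.85, 0.80, 0.83]

# Subtract the reaction time from each reading, then average.
corrected = [t - REACTION_TIME for t in raw_times]
average = sum(corrected) / len(corrected)
print(f"average launch time: {average:.3f} s")  # → 0.586 s
```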

So here is a visual comparison of the launch time for the different versions of Firefox since 1.0 (averages of ten measurements):

Start-up times for Firefox 1.0, 1.5, 2.0 and 3.0 on a Windows XP AMD 3200+ system.

Firefox 1 took just over 0.5 seconds on average. Firefox 3 takes 0.6 seconds. That's a difference of 0.1 seconds on a four-year-old system. On a newer system the difference will be even smaller.

I also tested the cold launch times, by rebooting windows between each startup. These are averages of five start-ups:

Cold launch times for Firefox 1.0, 1.5, 2.0 and 3.0

Here the pattern is the opposite. Firefox 2 launches faster than 1.5 which launches faster than 1.0. Firefox 3 launches slightly slower than version 2 but is still a few seconds better than version 1 and 1.5.

I also tested average page loading times for DN.se, a fairly heavy page with plenty of Flash and images.

Average page load times for Firefox 1.0, 1.5, 2.0 and 3.0

Again I did five measurements for each version. Then I repeated the procedure, just to make sure no version was being helped by network caching. (I also loaded the page a few times before starting the test to make sure no browser was disadvantaged by being first.)

Between each page load I cleared all history, browser cache and cookies. So these values should be pretty representative for cold page load times for pages with plenty of images and Flash.

Conclusion

So, Firefox 3 takes roughly 20 percent longer than Firefox 1 to launch warm, which amounts to at most a few tenths of a second. For cold starts (the first start after booting your computer), Firefox 3 launches about 30 percent faster than Firefox 1. Also, each page load in Firefox 3 probably saves you several seconds compared to Firefox 1.

We already know that Firefox’s memory consumption has gone down a whole lot and Javascript speed was improved by a factor of 3 or 4 for version 3.

Bloated?

So could someone explain to me how Firefox 3 is bloated? Is Firefox bloated because it lets you find visited pages easily from the location bar? Is it bloated because it has an industry leading automatic update system? Because it lets you rearrange tabs as you like, because it passes the Acid2 test or because it can remember your tabs from session to session?

Yes, Firefox has added many features since version 1.0. But it just hasn’t gotten bloated in the sense of unnecessary features that get in the user’s way.

Quite the opposite is true in fact. The Firefox developers have thoughtfully added many capabilities to Firefox without forcing mums and grannies to jump through hoops when they want to go on-line. At the same time they have made it load web pages much faster, and cold starts are much quicker. Warm starts are marginally slower.

Monday, January 15th 2007

Quick Sharpness Test of the Canon EF 50mm 1:1.4 USM

In response to this thread over at photo.net I decided to test my copy of the Canon EF 50mm 1:1.4 USM at various apertures.

The question posted in the thread is basically: "Can the EF 50mm be used at f/1.4, or is it only sharp at f/2.8?"

Since it is late at night here, my test subject is rather boring. It’s a Volvo brochure laid out flat on my floor:

However, the tiny text is good for showing lens sharpness.

I shot test pictures at f/1.4, f/2.0, f/2.8, f/4.0, f/5.6 and f/8. Here are the obligatory 100% crops. I should add that the images were shot with sharpness set to 3 (out of 7) and contrast at -4 on my 30D.

F/1.4

As you can see, at f/1.4 the edges are slightly soft. The edges are sharper at f/2.0, and even sharper at f/2.8:

F/2.0

F/2.8

From f/2.8 and up, the results are pretty similar:

F/4.0

F/5.6

F/8

So, to answer the question: The Canon EF 50mm 1:1.4 USM is not as sharp at f/1.4 and f/2.0 as it is at f/2.8 and above, but I’d say it is definitely usable.