Leica/CV/Konica lens tests

greenspun.com : LUSENET : Leica Photography : One Thread

The lens test pages that were previously referred to are now up on my web site:

Lens Tests

-- Paul Chefurka (paul@chefurka.com), April 16, 2002

Answers

Dear Paul,

Clicked twice on "Lens Test" and got "Not Found" both times. What gives?

Best,

Alex

-- Alex Shishin (shishin@pp.iij4-u.or.jp), April 16, 2002.


Alex, I tried to email you re: waist-level finders by clicking on your address....please email me.

-- Steve (leitz_not_leica@hotmail.com), April 16, 2002.

I just clicked on the link above using both Netscape 6.2 and IE 6, and it worked. Try cutting and pasting http://www.chefurka.com/lenstest/lenstest.html

Anyone else having problems?

-- Paul Chefurka (paul@chefurka.com), April 16, 2002.


I read the data easily. No problem w/ the posting.

-- Patrick (pg@patrickgarner.com), April 16, 2002.

I just cut and pasted the URL, and got NOT FOUND. It also disabled the BACK button, which is very annoying.

-- Joe Buechler (jbuechler@toad.net), April 16, 2002.


The main page doesn't disable the "back" button on my browsers, but the sub-pages each come up in a new window, which will have no associated history.

-- Paul Chefurka (paul@chefurka.com), April 16, 2002.

I'm having trouble, in Netscape 4.7x. I get the following error:

The requested URL /portfolio/css/simtxt.css was not found on this server.

It looks like the page is looking for a style sheet (simtxt.css) that it can't find. Can you go into the page and eliminate the reference to it?

For instance, remove or comment out the lines containing:

<link rel="stylesheet" href="../../portfolio/css/simtxt.css" type="text/css">

<link rel="stylesheet" href="../css/simoncss.css" type="text/css">

HTH, TW

-- Tse-Sung (tsesung@yahoo.com), April 16, 2002.


In fact, it appears all your pages refer to the non-existent style sheet (simtxt.css), so none of them will open (at least in my browser) unless you remove these references or make the CSS file available.

-- TSW (tsesung@yahoo.com), April 16, 2002.

Hi Paul,

I've had no problem viewing your site. The tests are interesting. What's the source, and how were the tests done? I haven't looked at all the charts yet, but the two for the 35/1.4 and 35/2 are really interesting.

I've always had the impression (I don't own a Leica M yet, but soon) that, apart from the extra stop, the 35/1.4 slightly underperforms the 35/2 at f2 and smaller stops. Your charts show the opposite: at f2, the 35/1.4 outperforms its little brother in the centre and equals it at the edge. Performance, both in centre and edge, is equally outstanding for both lenses at f4 and smaller.

The charts comparing the 50/1.4 and 50/2 confirm my understanding, as the 1.4 lens suffers significantly in edge sharpness at almost all f-stops.

One caveat though - I don't know exactly what the rating means, but I'd guess it represents sharpness.

-- Fred Lee (leefred@cadvision.com), April 16, 2002.


Well, it seems pretty clear that THE lenses, in terms of the CdI tests, are the 28/2.0 Summicron ASPH, either of the 35mm 1.4 or 2.0 ASPHs, the 50mm/2.0 Summicron, the 75/1.4 Summilux, the 90/2.0 Summicron AA, and the 135/3.4 Telyt. Unless you want a good slow 50, forget the 3E; and the Konica and VC lenses just aren't up to the task.

-- Dan Brown (brpatent@swbell.net), April 16, 2002.


The Voigtlander subpage comes up blank.

-- adam g. lang (aglang@hotmail.com), April 16, 2002.

How trustworthy are these charts, Paul?

Interesting to see the 50 'cron and 'lux differences and how those complement each other. I can't say which of the two is better; they are so different.

As for the Nokton, it has nothing to compete with the 'lux, IMO.

I'm beginning to like the combo of both 50s,

-- r watson (al1231234@hotmail.com), April 16, 2002.


OK, it looks like a bit of work is in order on my part. I posted the pages verbatim as I got them from Simon, and it looks like some browsers are choking on them. I'll look at redoing the pages without CSS, and let you know when I've got something else to look at.

The tests are from Chasseur D'Images, and I don't know how they were done. Given that, all you can do is assume that the same test methodology was applied to all lenses, and therefore it's safe to compare one lens to another. You can't get any idea of the absolute quality of any one lens from these charts, though you can get a good idea of the relative performances of different lenses, and some idea of a lens' "fingerprint" in terms of its performance at various apertures.

-- Paul Chefurka (paul@chefurka.com), April 16, 2002.


OK, I've taken out CSS and modified the pages to use fewer tables. Try it now. Lens Tests

-- Paul Chefurka (paul@chefurka.com), April 16, 2002.

My IE 5 browser had no trouble with any of the pages. It's probably some sort of MTF test, but it could be a resolution chart for all we know. The results jibe with my experience pretty much, though I haven't shot every single lens tested, of course. The 90 Elmarit is very close to the more expensive teles in image quality, and along with the 50, it has to be one of the best buys among Leica-made M lenses.

-- Andrew Schank (aschank@flash.net), April 16, 2002.


Hi Paul,

I've never read the French magazine Chasseur d'Images, and I found the tests on Leica/Konica/Voigtlander interesting.

I currently own Contax SLR (manual focus) and G lenses and wonder if there are similar tests/charts on these lenses?

-- Fred Lee (leefred@cadvision.com), April 16, 2002.


Fred, you can order the full complement of lens tests, by brand, directly from CdI at their website. Although Canon, Leica, Minolta, Pentax, and Sigma dossiers are listed, I didn't notice Carl Zeiss among them.

http://www.photim.com/Cmd4/Article.asp?R=FT-16

-- Brian Walsh (brian.walsh@sbcglobal.net), April 16, 2002.


Paul, thanks for providing the link for these tests. On reading them, it is clear that, at least according to CdI, the Leica lenses do considerably better than comparable lenses from CV, Konica, and Ricoh, at both large and smaller apertures. In fact, if anything, these competitors' lenses do not do as well, relatively speaking, as Erwin's reviews would suggest; i.e., Erwin rates them higher than CdI does.

These independent tests would seem to indicate that Erwin's reviews do NOT overstate the performance of the Leica lenses. I wonder if those who criticize his reviews (but never ever provide any actual data to back up their diatribes) would be willing to apologize. Somehow, I doubt it very much.

-- Eliot (erosen@lij.edu), April 16, 2002.


"I wonder if those who criticize his reviews (but never ever provide any actual data to back up their diatribes) would be willing to apologize. Somehow, I doubt it very much. "

I criticized Erwin, and I did provide data to back it up. I checked his math, found it wanting, and gave hard numbers. Specifically, he was wrong about the speed of the M7 shutter curtain, overstating it by a factor of 10, he was wrong about the usefulness of 1/4000 shutter speed, not understanding that 1/1000 at 5.6 is the same as 1/4000 at 2.8, and he was wrong about shutter lag, stating that digital cameras have a shutter lag of 400-700 milliseconds, when in fact the best digital cameras have lag times of 55 (Canon EOS-1D) and 58 (Nikon D1X) milliseconds.
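His aperture/shutter arithmetic checks out. As a quick illustration (a sketch using the standard APEX exposure-value formula, not anything from the thread), two settings give the same exposure when log2(N²/t) agrees:

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """APEX exposure value: EV = log2(N^2 / t).

    Equal EV means equal total exposure on the film.
    """
    return math.log2(f_number ** 2 / shutter_s)

ev_slow = exposure_value(5.6, 1 / 1000)   # f/5.6 at 1/1000 s
ev_fast = exposure_value(2.8, 1 / 4000)   # f/2.8 at 1/4000 s

# Two stops smaller aperture, two stops faster shutter: the EVs match.
print(f"{ev_slow:.2f} vs {ev_fast:.2f}")
```

With the nominal markings, 5.6 is exactly 2 × 2.8, so both settings work out to roughly EV 14.9; with the true values (4√2 and 2√2) the identity is exact.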

As for data about lens quality, show me one piece of hard evidence that Erwin has published, apart from specs copied out of a Leica brochure. The reviews I have seen consist of words like "fine detail," "high contrast," "optimum." Data? How can I back up my "diatribe" with data? There is no data. There's nothing to refute. I can't dispute that Erwin thinks a lens is good, or that he thinks one lens is better than another. He does, he says so, it's a fact that he thinks this way, no one argues that he doesn't know what he thinks. Why is disagreeing with Erwin a "diatribe?"

As for these reviews, what do they show? What is being measured? I see graphs with more Erwin-like terms ("outstanding"). This is like reading a weather report that tells you nothing more than "The weather will be shitty tomorrow." What does that mean? Depends on what you think shitty weather is, and if you and the weatherman don't agree, the report is useless.

-- Masatoshi Yamamoto (masa@nifty.co.jp), April 17, 2002.


Paul, sorry, this is a quick note to Steve.

I've changed e-mail addresses at home. (I'm at school with a rare spare moment.) I'll contact you from home so you and everyone can have my new e-mail address.

Alex

-- Alex Shishin (shishin@suma.kobe-wu.ac.jp), April 17, 2002.


Paul,

Connected. Interesting. Thank you.

Alex

-- Alex Shishin (shishin@suma.kobe-wu.ac.jp), April 17, 2002.


Paul,

Connected with home computer as well. Great.

Alex

-- Alex Shishin (shishin@pp.iij4-u.or.jp), April 17, 2002.


Masatoshi

I doubt there would be many buyers for a book full of MTF curves of Leica lenses at all apertures and at a number of distances. It would be huge, very difficult to compare, and very dull - interpretation is always needed. The best way to compare MTF curves is to hold one over another (if you can make the top one transparent, as on an acrylic sheet), or by data processing methods. Then you have to compare different apertures, at infinity and close up. The graphs alone can be very difficult, particularly when the lenses are very similar. You choose not to believe Puts - fine. Perhaps you can provide us with a better set of data? A few errors does not a fool make. ALL books contain errors of some sort. It is not always sinister. Many of us find his results useful - you seem to have a problem with this. Maybe you should go back to bed and get up again, since you seem to have got out of bed the wrong way.

-- Robin Smith (smith_robin@hotmail.com), April 17, 2002.


These tests were interesting, but I did find conflicts with many other testers. One main example was the Leica 50mm f1.4, which even Erwin rated optically behind the Voigtlander Nokton 50mm f1.5. (Not buying into another Erwin debate, but if he put a Voigtlander lens above a Leica one, I have a tendency to believe it.) But these charts claim the Nokton doesn't even come close. Having compared my Summilux 50 f1.4 to a friend's Nokton, I had to swallow my pride and concede the Nokton was better in this instance. Since that time I have embraced a few Voigtlander lenses. I went compact (i.e. CL/CLEs) since then and find the size of the Voigtlanders in a number of focal lengths more suitable.

-- Joel Matherson (joel_2000@hotmail.com), April 17, 2002.

On the Nokton/Summilux issue, one question pops to mind. What's the likelihood of sample variation in a Cosina lens vs. a Leica lens? I'm inclined to believe Erwin as well, so it's possible CdI got a duff Nokton. OTOH, the apparent degree of difference between the two marques in these tests compared to Erwin's results hints that Erwin's tests may narrow the gap a bit.

-- Paul Chefurka (paul@chefurka.com), April 17, 2002.

Paul and Joel: I don't know this for certain, but since Erwin always comments on the build, centration, and other mechanical issues that may affect optical performance measurements, I suspect he may weed out bad examples of the lenses he tests. So I wouldn't be surprised if sample variation contributed to the poor evaluation of the 50/1.5 Nokton by CdI and the better evaluation by Erwin.

I was introduced to the business of lens-to-lens variation when, a number of years ago, Modern Photography published resolution and contrast tests of multiple samples of the same lens (the then-current 50/1.4 Nikkor) and found a mind-boggling variation in the performance of different samples of the same lens from the same manufacturer (Nikon). The amount of variation was truly remarkable, and it makes understandable the older practice of professional photographers of testing a number of samples of the same fast lens to pick one that meets their requirements.

Hopefully, Leica's tolerance standards are sufficiently stringent that the chances of getting a lemon (in terms of optical performance, if not mechanics) are small. But these considerations do figure into the evaluation of lens testing results.

-- Eliot (erosen@lij.edu), April 17, 2002.


Robin,

My point about the usefulness of the charts is that there is no quantifiable data, only bar graphs linked to subjective terms. What makes one lens "good" and another "outstanding?" Is it 5% better? 25% better? Who knows? Is the difference between grades noticeable in practical use, or only in a rigorous test? These are reasonable questions.

The "maybe you have something better!" response is childish. It's a bit like saying that news coverage from Afghanistan is wanting, and then being told, "OK, go there yourself and see what it's like!" Or if I say that I'm not a fan of HCB, being told, "Oh yeah? Well how many photos have YOU had published?" Shoot the messenger when you don't like the message.

-- Masatoshi Yamamoto (masa@nifty.co.jp), April 17, 2002.


There are problems with any kind of lens testing - sample variations, and how comprehensive and consistent a test was done (resolution chart at one distance or multiple distances, MTF curves, measurements of distortion, testing for several types of flare, color accuracy, "bokeh", etc.). People also get sucked into believing you can quantify lens quality in an oversimplified fashion, like an average MTF number or a designation like "good" or "excellent". Someone at Photo.net recently wanted to know why, if Hasselblad was supposed to have great Zeiss lenses, the MTF numbers were low compared to some inexpensive 35mm lenses. Apples and oranges, as they say.

I still believe the best way to tell if a lens is really excellent is to take a bunch of images with your own sample under many different circumstances and see if it performs to your liking. My 40mm Rokkor didn't do anything special on the resolution chart I tested it on (it was very good but not "outstanding" - whatever that means), but the real-life images I get with it always amaze me with the color, contrast, and level of fine detail rendered - even those taken at f2.0. Same with an old Schneider Xenar on a Rolleicord - fantastic lens! About sample variation: another lens I like a lot, the 24-120mm Nikkor AFD zoom, has been a great performer for me for several years now - sharp even at the maximum apertures - but it tested out as a dog by some lens testers and as the "best wide-to-tele zoom ever tested" by another.

-- Andrew Schank (aschank@flash.net), April 17, 2002.


Andrew,

I agree with you completely. As an example (here it comes, real "data" for those who demand it), I have the Konica 50mm f2 Hexanon and the Leica 50mm f2 Summicron (latest version). I would bet that the Leica would have better numbers on MTF charts and resolution tests. It is very sharp. But the Konica lens takes nicer pictures for my purposes. It is "outstanding" for me. It may not be for you.

-- Masatoshi Yamamoto (masa@nifty.co.jp), April 18, 2002.


Everyone!

Paul mentioned that these charts came from Chasseur d'Images. C.d'I. is quirky, which makes them fun. They rate lenses in terms of technical aspects and in terms of how much they happen to like them (Note technique and Cote d'amour). Now here is an example of a little surreal fun: the Hexanon 28/2.8 gets five stars in both categories. The Leica Elmarit gets four stars in both categories. But if you check their charts, translated in Paul's lens tests, you'll see that the Leica lens outperforms the Hexanon. So what gives?

A test I have some questions about is that of the VC Ultron 35/1.7. A Japanese journal's field tests clearly show that it is a very good to excellent performer wide open in night / city-lights situations. Its commentary praises the lens highly in terms of its sharpness and signature. Yet we see by the C d'I charts that it is a poor performer wide open. Again, what gives? (I am not being rhetorical; I just don't know.)

Other tests confirm Erwin Puts's tests (the Leica 28/2 vs the UC 28/1.9, for example).

But it is always fun to look at lens tests. So thanks again, Paul.

Best,

Alex

-- Alex Shishin (shishin@pp.iij4-u.or.jp), April 18, 2002.


Let's not forget that it was Simon Alibert who went to the trouble of creating these pages - I just decided to host them. I have less of a conscience about copyright violations, and my brother-in-law is a lawyer, so I can get a good price for my defense if CdI sends their pinstripe brigade after me :-)

So just to set the matter straight - thank you Simon. I think your pages even silenced Robert Monaghan over on rec.photo.equipment.35mm, and that takes a lot!

-- Paul Chefurka (paul@chefurka.com), April 18, 2002.


Masatoshi

Well, my point was that you were aggressively criticising Puts and seemed to be unaware of how difficult this whole area is. It is very easy to criticise, but in 20 years no one else has attempted what he has done. Also, you have just fallen into the same trap that you accuse him of: your assessment shows no data whatsoever, just an "it's very good" type of statement - not a lot of use, as I have no idea what your standards are. From reading Puts's book I have a pretty good idea of his standards.

-- Robin Smith (smith_robin@hotmail.com), April 18, 2002.


...thought you chaps might be interested...Why you can't trust lens tests! Bob Atkins

Lots of magazines and a few newsletters publish the results of lens tests. Even some individuals post results of lens tests to the rec.photo newsgroups. The big question is just how good are these tests, and how much can you depend on the numbers they give? First let's look at what you would need to do a good lens test:

(1) A selection of randomly purchased lenses of the same type. You would need to obtain several lenses in order to see what the average quality was and make sure that there isn't a large lens-to-lens variation. The lenses should be from different batches, purchased from different stores at different times, to obtain as random a sample as possible. Most magazines, newsletters and individuals just don't do this. Popular Photography tests just one lens - they even list the serial number in the test. Statistically, unless you are sure that lens-to-lens variation is small (and how do you know this if you only test one lens?), the results of testing a single sample are unreliable as a predictor of what you might expect if you bought a similar lens.

If you don't believe me, here is a direct quote from Popular Photography (January 1994, page 44): "Since optics and precision instruments vary from unit to unit, we strongly suggest that readers carry out their own tests on equipment they buy"
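Atkins's single-sample point is easy to demonstrate with a toy simulation (the figures below are invented for illustration, not from any magazine's data): draw copies of a lens from a distribution with some unit-to-unit spread and compare one tested copy against the batch average.

```python
import random
import statistics

random.seed(42)  # reproducible run

# Hypothetical figures: suppose the model's true average resolution is
# 80 lp/mm, with an 8 lp/mm standard deviation from copy to copy.
TRUE_MEAN, UNIT_SD = 80.0, 8.0

def sample_lenses(n: int) -> list[float]:
    """Simulate testing n randomly purchased copies of the same lens."""
    return [random.gauss(TRUE_MEAN, UNIT_SD) for _ in range(n)]

one_copy = sample_lenses(1)[0]    # what a magazine testing a single unit sees
batch = sample_lenses(10)         # a more representative random sample
batch_mean = statistics.mean(batch)

print(f"single copy:  {one_copy:.1f} lp/mm")
print(f"mean of ten:  {batch_mean:.1f} lp/mm")
```

Re-run with different seeds and the single-copy figure swings around the batch mean by several lp/mm - exactly the unreliability Atkins describes.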

(2) A good lens testing methodology. You would need to test each lens in the same way, with the same equipment, and assess the results of each test using the same criteria. Furthermore, to compare lens tests done by different magazines, each would need to follow some standard test procedure. There is no ISO (or other) standard for lens testing. In addition, some magazines rate a lens by a single number, some by a letter grade, some by a set of numbers, and so on. Comparing test results from different magazines is like comparing apples to oranges: not only are their test methods different, but their methods of reporting and presenting their conclusions are different. And just what does a single letter grade or number represent? A "B" or a "9" might mean superb resolution but some flare and vignetting, or it could mean low flare and high contrast but only good resolution. Thus two lenses getting the same "grade" could, in fact, perform differently. When a single number is given to represent *all* aspects of optical performance, there is always room for confusion and uncertainty.

(3) A lack of bias. Ideally, whoever interprets the raw test data shouldn't know which lens the data come from. In the case of magazines, there is a reluctance to print a bad review of a lens from a major advertiser. I don't know if it's true, but I was told by someone who worked for a photo magazine several years ago that if they got bad numbers on a lens, they would retest another sample. If the results were better, they would publish the good numbers; if they were still bad, they wouldn't publish the test at all. With individuals, there is always a temptation to give a lens the benefit of any doubt when it cost you a lot of money. You want the lens to test well; even if this is subconscious, it can still affect judgement. I believe Popular Photography once said (and correct me if I'm wrong) that they never published a bad test report because people weren't interested in bad lenses. Almost every lens gets a rating of "average" or better. Clearly "average" doesn't mean average in the sense that the lens is better than half the other lenses and worse than the other half. It appears to mean "adequate", i.e. a lens that will not be regarded as "bad" by the typical reader of the magazine.

(4) Actually testing the lens! Some magazines, like Outdoor Photographer, Petersen's PhotoGraphic and Shutterbug, publish "reviews" or "user reports". These are "lens tests" based on subjective judgements. Such judgements may be fine, but you have no idea at all whether or not they are. You have no idea of the standards, experience or knowledge of the tester, nor whether they have any bias (if they wrote a review which "panned" a lens, it wouldn't get published, so there is a strong incentive to say something good, or at least not to say anything bad).

So what good are lens tests? Well, even given all the failings listed above, if most of the reviews of lens A suggest it is better than lens B, then it probably is (given you know what "better" implies!). Just make sure that the magazines all tested the same lens (not versions 1, 2 and 3 over a 10-year time frame). By "all" the magazines, I mean the US magazines, British magazines, French magazines, German magazines, Swedish magazines and so on. If you depend on US magazines, only Popular Photography even claims to do objective testing based on a scientific method; the other magazines just give "user reports". A few newsletters, like George Lepp's "Natural Image", do semi-scientific testing under somewhat controlled conditions. The fact that, on occasion, there are very significant differences between different magazines' tests of nominally the same lens should give you food for thought and indicate that depending on a single review isn't a very good idea. A good compilation of lens test results (for Canon EOS and 3rd-party lenses) can be found at http://www.cmpsolv.com/photozone/lenses.htm

If you search through magazine reviews you can often find examples of contradictory test data on the same lens. For example, FotoMagazin rates both the Sigma and Canon 300/2.8 lenses optically equal (9.6 out of 10), whereas Chasseur d'Images gives the Canon lens a "****" optical rating (very good) but the Sigma lens only a "**" optical rating (average). Two stars (**) is pretty bad: there are no one-star lenses! Four stars (****) is very good; there are very few five-star lenses. You would get a very different view of the Sigma lens if you only read FotoMagazin than you would if you only read Chasseur d'Images. On this occasion I tend to think Chasseur d'Images is closer to the truth, but who can say for sure?


Believing other people's opinions. The least reliable of all "tests" are the comments in Usenet newsgroups to the effect that "I have that lens and it's really great". You simply have zero idea what the poster considers great. He/she might be judging the lens on the basis of 3.5x5" prints from the local supermarket. They may have just moved up to 35mm from a disc camera. Their eyesight may be so bad that any recognizable image is "great". On the other hand, it's possible that they are professional photographers, shooting on a daily basis for critical photo editors. Unless you have some idea of what the standards of the person making the comment are, claims that the optical performance of any given lens is "good", "bad", "fair", etc. don't carry much weight. There are, of course, some very knowledgeable contributors to the rec.photo Usenet groups. You just have to figure out who they are!



-- Yogesh Jeram (Yogeshjeram@hotmail.com), April 18, 2002.


Bob Atkins vents more hot air than all of us put together and loves the sound of his own keyboard, but he talks some sense here!

-- Robin Smith (smith_robin@hotmail.com), April 18, 2002.

> They rate lenses in terms of technical aspects and how they happen to like them

Isn't price also a rating consideration for that magazine? Seems to me I read that somewhere or other, but of course I might be wrong.

-- John Hicks (jhicks31@bellsouth.net), April 18, 2002.

