Very few people understand the real reason why Kodak failed to dominate the digital world. It actually dates back to a $300 million patent lawsuit that Honeywell won against Minolta over an autofocus system. The Japanese camera companies were so outraged over this perceived injustice that they vowed never to engage in technology sharing with American companies again, and this ended up crippling Kodak's ambitions.
Digital photography is one of those innovations that wonderfully dates pre-1990s science fiction, in which people of distant futures still fiddle with film chemicals.
The first book of David Brin’s Uplift series was written in 1980 and takes place on an antigravity spaceship that can penetrate deep into the Sun while carrying alien ambassadors. Yet one of the major plot points is someone using the onboard darkroom to develop pictures that reveal something essential.
I’m hoping someone will make a new sci-fi movie with a vintage aesthetic that intentionally emphasizes and magnifies this old-school analog awesomeness of galactic empires that seem to entirely lack integrated circuits. Apple TV’s “Silo” has wonderful production design, but it’s too claustrophobic to fulfill my wish.
“The Mote in God’s Eye” would be my pick if I could get any IP developed with this approach.
Battlestar Galactica (2004) has an aesthetic like that. While they do use mainframe computers, they avoid all networking due to the risk of being hacked. Galactica is the only Battlestar to survive the first episode specifically because it’s the only one that still uses this outdated technology.
Plus, it’s just one of the best TV shows ever made in any genre.
> I'm hoping someone will make a new sci-fi movie with a vintage aesthetic that intentionally emphasizes and magnifies this old-school analog awesomeness of galactic empires that seem to entirely lack integrated circuits.
This is what I hoped for Foundation, to replicate the 1940s now-retrofuturism I imagine while reading the books. Alas, it wasn't to be.
Sometimes I think a lot of myself. Sometimes I don't. During the times I do, I console myself about my lack of success by thinking that I have never been in the right place at the right time.
But had I been in that place at that time, I would not have invented the digital camera. That guy Sasson was clearly capable far beyond the rest of us.
He is an exceptional engineer. In 1986 I developed an instant messaging system that worked across the internet on X Windows. It was very popular at HP, where I worked, and it had many of the features of modern messaging systems like WhatsApp. I didn't think twice about it. A few years later I saw how these apps took the world by storm once the internet became popular. I think I had caught lightning in a bottle but didn't appreciate it; it's kind of the opposite of Sasson's story. In both cases, though, we were lousy evangelists. Also, I'm not an exceptional engineer.
That's a poignant observation. There are "times and places" for things. And whether you or I would have been "the right person" at that time is hard to know.
I consider Wozniak (an obvious example), who was at the "right time and place" in the early 1970s. He was at the engineering capital of the U.S. (Silicon Valley — already known by that name at the time), knowing adults in engineering fields who could get him otherwise expensive and, for the time, new microprocessor chips… just as the chips were becoming more affordable — just as Don Lancaster's "TV Typewriter" and the "Altair 8800" began to grace the covers of Radio-Electronics and Popular Electronics…
Woz seemed to flounder, or be somewhat overwhelmed, a decade later, when hacks with a 555 timer chip, a few NAND gates, or NTSC timing hijinks to get color were no longer where the industry was going. He took a back seat on the engineering side.
At the same time, not to diminish Woz's skills in 1975, there were a lot of other smart kids in the "Valley" then who did see their home-brew computers become products.
(And then there is so much more to unpack when you allow for Jobs's contributions, U.S. schools purchasing Apple computers, etc.)
A very similar PetaPixel article with a couple more technical details: [1] In particular, it describes the reason for the first corrupted image – they had wired the four-bit output in the wrong order, so that the high bit was the lowest and vice versa. Thus, all-ones still looked white and all-zeros black, but the rest of the shades were scrambled.
[1] https://petapixel.com/how-steve-sasson-invented-the-digital-...
https://petapixel.com/what-is-ccd-cmos-sensor/
and https://www.teledynevisionsolutions.com/learn/learning-cente...
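A tiny sketch (mine, not from the article) of why that miswiring scrambled the mid-tones but left pure black and white intact: reversing the order of four bits maps 0000 and 1111 to themselves, while every other value lands somewhere else.

```python
def reverse_4bit(value: int) -> int:
    """Reverse the order of the four bits in a 4-bit value,
    mimicking the swapped wiring on the prototype's output."""
    result = 0
    for i in range(4):
        if value & (1 << i):
            result |= 1 << (3 - i)
    return result

# The endpoints survive the scrambling, which is why pure white
# and pure black still looked right in the corrupted image:
assert reverse_4bit(0b0000) == 0b0000  # all-zeros -> still black
assert reverse_4bit(0b1111) == 0b1111  # all-ones  -> still white

# Every other shade is remapped, e.g. a dark gray of 0b0001 (level 1)
# comes out as 0b1000 (level 8):
assert reverse_4bit(0b0001) == 0b1000
```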
So wild. The wire-wrap boards are truly frightening to look at.
And the photos in the article of the old "Instamatic" Kodak film cameras (especially that 110 pocket camera) suddenly brought back the formaldehyde-like smell of developer chemicals from when I worked at a one-hour-photo lab in high school.
I wonder if any of those images exist on the internet and if the camera is still functional.
Edit: it's very likely that no photos exist because the tapes were being reused and there are many reasons why the camera has been nonfunctional for a long time now.
Yeah, the camera probably hasn't been in functioning condition for decades and people at Kodak likely didn't see much historical value in archiving those tapes.
I don't doubt this description of what happened, but the sad irony of a company whose business was making tools to create archival copies of images not recognising the value of retaining archival copies of its own images... facepalm.
The popular notion that "Kodak invented the thing that killed them" is basically nonsense.
Steve Sasson's tale of technical struggle in 01975 at Kodak is real, but dozens of other people were doing the same thing at the same time at different companies, or in their dormitories, because at that point the problem of building a handheld digital camera had been reduced to a problem that one guy could solve with off-the-shelf parts. In fact, earlier the same year, a digital camera design was published as a hobbyist project in Popular Electronics, using a 32×32 MOS sensor, and commercialized as the Cromemco Cyclops. (You just had to keep it plugged in; you couldn't take it with you to the Little League game, even though it was small enough to lift in one hand.) https://en.wikipedia.org/wiki/Cromemco_Cyclops
The reduction of the problem to such a manageable size was the result of numerous small advances over the previous 50 years.
Landsat 1 was a digital camera that was initially planned in 01970 and launched into space in 01972; it just weighed a tonne, so you couldn't hold it in your hand. https://directory.eoportal.org/satellite-missions/landsat-1-... says:
> It quickly became apparent that the digital image data, acquired by the MSS (Multispectral Scanner) instrument, a whiskbroom scanning device, were of great value for a broad range of applications and scientific investigations. For the first time, the data of an orbiting instrument were available in digital form, quantified at the instrument level - providing a great deal of flexibility by offering all the capabilities of digital processing, storage, and communication.
Landsat 1 was built by General Electric, RCA, NASA, and subcontractors, and the MSS digital camera component in particular was designed by Virginia Norwood at the Hughes Aircraft Company, not at Kodak.
Ranger 7 in 01964 https://en.wikipedia.org/wiki/Ranger_7 was an electronic camera that was successfully launched into the moon and returned close-range photos of it over radio links, but, as far as I can tell, it wasn't a digital camera; the RF links were analog TV signals.
Handheld electronic cameras, for a very strong person, might date back to Philo T. Farnsworth's Image Dissector in 01927 https://en.wikipedia.org/wiki/Video_camera_tube#Experiments_... or Zworykin's Iconoscope in 01933 https://en.wikipedia.org/wiki/Video_camera_tube#Iconoscope, but in practice these were only reduced to handheld-plus-backpack size in the 01950s https://en.wikipedia.org/wiki/Professional_video_camera#Hist.... Farnsworth was at the Farnsworth Television and Radio Corporation, not at Kodak. Zworykin was at Westinghouse and RCA, not at Kodak.
The first experimental digitization of a signal from an electronic camera was probably done by Frank Gray at Bell Labs, not at Kodak, in 01947, for which he invented the Gray Code. To be able to keep up with live full-motion video data, his analog-to-digital converter was a sort of cathode-ray tube with a shadow mask in it with the code cut into it; this is described in patent 2,632,058, granted in 01953: https://patentimages.storage.googleapis.com/a3/d7/f2/0343f5f....
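The property Gray needed, and the reason his code still shows up at the analog-digital boundary, is that consecutive values differ in exactly one bit, so a borderline reading in the digitizer can only be off by one level, never by a large jump. A minimal illustration (the `n ^ (n >> 1)` formula is the modern closed form, not Gray's tube hardware):

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

# Verify the one-bit-change property over the 4-bit range Sasson's
# era of hardware would have used: each adjacent pair of codes
# differs in exactly one bit position.
for a in range(15):
    diff = to_gray(a) ^ to_gray(a + 1)
    assert diff != 0 and diff & (diff - 1) == 0  # exactly one bit set
```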
The video camera tubes that were the only way to build electronic cameras up to the 50s, and which made the cameras large and heavy, were supplanted by CCDs like the 100×100 Fairchild MV-101 that Sasson used in his prototype at Kodak. The CCD was developed by Smith and Boyle at Bell Labs, not at Kodak, in 01969–70: https://en.wikipedia.org/wiki/Charge-coupled_device
However, any DRAM chip is also an image sensor, which is why they are encapsulated in black epoxy to prevent them from sensing light; without the CCD, we would have had CMOS image sensors anyway just because of the light-sensitivity of silicon. In fact, the Cromemco Cyclops used just such a chip.
The fundamental thing that made digital cameras not just possible but inevitable was microelectronics, a technology which owes its existence in 01975 to a long series of innovations including the point-contact transistor (Bardeen and Brattain, 01947, Bell Labs, not at Kodak); the junction transistor (Shockley, 01948, Bell Labs, not at Kodak); the monolithic integrated circuit (Noyce, 01959, Fairchild Semi, not at Kodak); the planar process (Hoerni, 01959, Fairchild Semi, not at Kodak); the MOSFET (Kahng and Atalla, 01959, Bell Labs, not at Kodak); the self-aligned silicon gate (Faggin, 01968, Fairchild Semi, not at Kodak); and, as mentioned in the article, the microprocessor. The microprocessor was overdetermined in the same way as the handheld digital camera, and arose basically simultaneously at RCA, Motorola, TI, and Intel, but whoever we decide invented the microprocessor, it certainly wasn't done at Kodak.
Kodak should have ruled the digital imaging space. Instead, they collapsed.
A lot of it was because the film people kneecapped the digital folks.
Film was very profitable.
Until it wasn't.
The company I worked for was a classic film company. When digital was first getting a foothold (early 1990s), I used to get lectures about how film would never die, etc.
A few years later, it was as if film never existed. The transition was so sudden, and so complete, that, if you blinked, you missed it.
Years later, I saw the same kind of thing happen to my company that happened to Kodak.
The iPhone came out, with its embedded camera, and that basically killed the discrete point-and-shoot market, which was very profitable for my company.
When the iPhone first came out, the marketing folks at my company laughed at it.
Then, they stopped laughing.
>"But Joy had followed me back because she was curious, you know, and she was standing in the hallway. We turned around, and Joy says: 'Needs work,' and turned out and walked away."
This part reminded me of the Black Triangle (2004):
https://archive.ph/qqOnP
https://news.ycombinator.com/item?id=698753
https://petapixel.com/why-kodak-died-and-fujifilm-thrived-a-...
TL;DR: Fujifilm diversified quickly, Kodak clung to the film business for far too long.
That’s not an accurate summary of the article. The problem was that Kodak stuck to the photography business for too long. As the article states, in the early 2000s they were the number one seller of digital cameras. It just turns out making consumer digital cameras was a “crappy business,” as their CEO went on to say. Fujifilm diversified into healthcare, cosmetics, and making LCD display films.
I worked for a company that was beautifully run, with great, smart, hardworking people, led by someone who had been with the technology since the beginning. Almost immediately we got acquired by a public company built on a different technology, one that saw us as a threat, and the founders were retained just long enough to see their company and its workers basically trashed into a mediocre state.
This is a very common story, from what I understand, whether the intent is "if you can't beat them, buy them!" or simply to grow.
In Kodak’s case, I wonder if both those that saw it as the future and those that saw it as the end wanted to support and control it.
Also, it never ceases to amaze me that some of the best things and the most dangerous things are (1) not the ones you planned on and (2) the result of someone bending and breaking rules to pursue a passion project.
I mean, it was one of those inevitable technologies.
Other companies had already invented the CCD, it was only a matter of time before someone would digitise the signal and pair it with a storage device. It was an obvious concept.
All Kodak really did was develop an obvious concept into a prototype many years before it could be viable, and then receive a patent for it.
If this were really the case, I'm surprised the US government didn't engage in antitrust action.