iPhone: sweet solution, evolution and disruption

When the iPhone was announced in 2007, Steve Jobs introduced it by saying that Apple was launching three revolutionary products:

  1. Widescreen iPod with touch controls
  2. Revolutionary mobile phone
  3. Breakthrough internet communications device

At the time this seemed to cover the main functions that people would find useful on a mobile phone. Applications, or “apps”, were not yet something most people cared about: most mobile phones of the preceding era had pretty clunky interfaces, and getting hold of apps was controlled pretty firmly by the mobile networks.

The iPhone didn’t initially support a way for developers to make native apps, instead offering only the “sweet solution”1 of web apps in Safari. Before too long, though, a native SDK was released in 2008, the App Store opened for business in July of that year, and the initial app boom began.

So those “three revolutionary products” were joined by a fourth one: a “handheld game console”. In December 2008 Apple announced the top 10 paid apps in the App Store, and 7 of the top 10 were games:

  1. Koi Pond2
  2. Texas Hold’em
  3. Moto Chaser
  4. Crash Bandicoot: Nitro Kart 3D
  5. Super Monkey Ball
  6. Cro-Mag Rally
  7. Enigmo
  8. Pocket Guitar
  9. Recorder
  10. iBeer

This image shows the app icons for these ten apps.

These are a blast from the past!

One notable thing about the first iPhone that may seem odd now is that the camera was a stills-only camera: no video support, no flash, no exposure compensation, and only a tiny 2 megapixel sensor. Prior to the iPhone’s release, phones such as the 2006 Nokia N95 came with a 5MP camera, supported video and had a flash. So this aspect of the iPhone really only provided the bare-bones, minimal requirements for taking photos. You definitely couldn’t add “Digital SLR quality camera” to that initial trifecta of devices that Steve introduced the iPhone with.

Fast-forward to 2024 and the iPhone cameras have improved amazingly. The front-facing “selfie” camera alone on the iPhone 16 (first added with the iPhone 4) now takes 12MP photos and can shoot 4K video. Depending on which version of the phone you get, there are two or three main cameras, with the iPhone 16 Pro sporting 48MP sensors with regular, ultra-wide and telephoto lenses. The iPhone is now one of the most popular cameras in the world, not just among “phone cameras” but among cameras in general. The iPhone, along with other smartphones, disrupted the camera industry3, even with the much poorer quality cameras in the earlier phones.

Whilst the iPhone’s cameras have been incrementally improving year-over-year with each new model, this year sees an interesting new addition to both the regular and Pro versions of the phone: the new “Camera Control” button. The Camera Control button is found on the side of the phone, is specifically for controlling the camera functions, and provides a way to quickly access camera tools like exposure, depth of field and zoom4. It is both a physical button and a touch-sensitive control, so you can press and slide sideways to navigate, adjust and select settings.

Apple has been touting their new “Apple Intelligence” AI-based features along with the release of these new phones, but I think the Camera Control is the most interesting feature on these new iPhones5. The camera has been an important feature of the iPhone for some time now and has become the only camera many people use for taking photos and shooting video (even being used for making actual movies). But the addition of the Camera Control seems significant to me, in that Apple increasingly views the iPhone as an actual camera, not just a phone with a camera on it.

So it seems that alongside the earlier addition of “handheld game console” to the initial three products that comprised the first iPhone, we can now definitely add “high-resolution mirrorless camera” to the list of “products” that make up the iPhone.

Steve Jobs had a clear idea of the three core products that the iPhone was going to disrupt when he introduced the original iPhone, but I think the introductory slide would be something more like this had he envisaged gaming and photography becoming such a core function of the device6:

This image shows Steve Jobs on stage introducing the original iPhone with a slide saying "iPod", "Phone", "Internet" and the additions of "Gaming" and "Camera".

  1. MacStories wrote a good article looking back at the “sweet solution” back in 2018. ↩︎
  2. Ok, some might nit-pick that Koi Pond isn’t a game, but it’s definitely not a productivity app :) ↩︎
  3. Estimated to be an 87% drop in digital camera sales since 2010! ↩︎
  4. At the time of writing Apple has said additional functionality is coming in a future software update: “Camera Control will introduce a two-stage shutter that lets you automatically lock focus and exposure with a light press”. ↩︎
  5. Also, Apple Intelligence technically isn’t out at the time of writing, only with iOS 18.1 and subsequent point updates will it incrementally become available. ↩︎
  6. Is a “phone” still a core part of a smartphone device now? Arguably “internet communication” has replaced much of the traditional phone usage now. ↩︎

Apple <-> elppA

Apple held their September “It’s Glowtime” keynote event yesterday, announcing updated products like the Apple Watch, AirPods and iPhones.

The new iPhone Pro phones have a new cinematic slow-motion filming mode which records 4K video at 120FPS. They demonstrated this mode with a music video for The Weeknd, where they had him sing the song at 4x speed so that when it is played back at the slow-motion 24FPS rate the vocals play in realtime:

(There’s also another behind-the-scenes video with a bit more about how it was shot).
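As an aside, the maths behind the trick is just a ratio: the slow-down factor is the capture frame rate divided by the playback frame rate, and the performance has to be sped up by exactly that factor to land back in real time. A quick sketch (the frame rates are illustrative, taken from the figures discussed above):

```javascript
// The slow-down factor is the capture frame rate divided by the playback
// frame rate; the performer must speed up their singing by the same factor
// so the vocals play in real time once the footage is slowed down.
function slowDownFactor(captureFps, playbackFps) {
  return captureFps / playbackFps;
}

console.log(slowDownFactor(120, 30)); // 4, i.e. a 4x performance speed-up
console.log(slowDownFactor(120, 24)); // 5, if played back at a cinematic 24FPS
```

Note that 120FPS footage played back at 24FPS implies a 5x speed-up rather than 4x; a 30FPS playback rate is what would match a 4x performance.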

It reminded me of the video for the song “Drop” by The Pharcyde, directed by Michel Gondry. The whole video was shot with them performing everything backwards: the band had to learn to sing the words as they would sound in reverse, so that when the video is played in reverse it looks like they’re singing normally. Kind of wild!

(Here’s a link to watch the Pharcyde “Drop” video on YouTube in case the embed isn’t working properly).

11 years of iPhone images

Back in 2011 I started compiling short videos with all of the images that had been on my phone over the year. I did 2010 through to 2015, and then after that year I just never got around to compiling the following years. But with a welcome break over the holidays and 2020 over and done, I finally made time to compile the missing years’ videos. In some of the previous years I’d made a custom audio track, but for these ones I just looked for suitable tracks amongst YouTube’s free audio; I think they work quite well though.

I’m tempted to make one massive video spanning all 11 years, but I think that may be a bit too long! :) Here’s all 11 videos so far:

iPhone view of 2020

iPhone view of 2019

iPhone view of 2018

iPhone view of 2017

iPhone view of 2016

iPhone view of 2015

iPhone view of 2014

iPhone view of 2013

iPhone view of 2012

iPhone view of 2011

iPhone view of 2010

GoPro Hero 3 – 720p 120fps slowmo After Effects test

I have finally got myself a GoPro! These are a couple of test clips with footage shot in 720p 120fps mode on the GoPro Hero 3, then slowed down further in After Effects CS6 using Timestretch and Timewarp.

It’s a bit glitchy due to the flickering from the light behind it; I’ll try this again in bright daylight and compare results.


Music / Video formats uncovered articles by Shedworx

I’ve written quite a few posts about the various products that the guys over at Shedworx develop and that have been useful to me whilst working with video (AVCHD video in particular!). They’ve written a couple of articles which are great primers for people new to working with audio and video.

The articles explain various aspects of audio and video, such as codecs and bit rates, and are well worth reading, especially if you’ve never given much consideration to the output of your video or audio and have just gone for the defaults in apps such as iMovie. Even if you’ve been working in these areas and have a good knowledge of this stuff, they’re still worth a read.

Apple previews Final Cut Pro X at NAB in Las Vegas

Apple made their presence felt at the NAB show in Las Vegas with a preview of the next version of their Final Cut Pro video editing app, Final Cut Pro X.

In what is no doubt a long-overdue update, Final Cut Pro X brings what a lot of pro video users have been waiting for: a totally redesigned interface, 64-bit memory addressing, multi-processor support, background rendering (no more render window), GPU rendering, use of more than 4GB of RAM, video sizes from standard def up to 4K, and real-time native format video processing – that’s quite a list!

The other announcement that is pretty awesome is a new price: $299 (not sure what that’ll be in GBP yet, about £199 I’d guess) and, similar to Apple’s Aperture software which had a price drop recently, Final Cut Pro X will be available through Apple’s Mac App Store.

Native AVCHD?

An often-featured subject in blog posts here on Suburbia is the AVCHD file format. Previously AVCHD required transcoding into some other format, such as ProRes, in order to edit it in Final Cut Pro, so a good question is whether the new ‘real-time native processing’ feature means that this transcoding will be a thing of the past. This is a feature that Adobe’s Premiere Pro has had for a while, albeit one that requires quite powerful hardware to make use of it. It will be interesting to see how Final Cut Pro X compares with this.

June launch date

Final Cut Pro X is due to be launched through the Mac App Store in June. In the meantime you can find out more about the new features via a couple of YouTube clips filmed at the preview announcement:

Adobe Labs release Wallaby Flash to HTML5 converter

Adobe have released an interesting experimental tool called ‘Wallaby’ over on Adobe Labs. It’s basically an app that tries to convert Flash FLA source files into ‘HTML5’ compatible animations. Note I’m using ‘HTML5’ in quotes as the term here covers the overall grouping of HTML5 and related technologies such as CSS3, JavaScript / jQuery, SVG etc. It’s worth reading over the Release Notes to get an idea of the current limitations, the biggest issues at the moment being WebKit-only browser support and no conversion of ActionScript or sound within FLAs.

I just happened to be playing around with Flash this morning, doing a little shape tweening with a couple of symbols I found over on The Noun Project website, so I thought I’d use that FLA file and see how Wallaby coped with it.

Running Wallaby:

After starting Wallaby you get a very simple interface: a file select field that you use to select the FLA file, and a large Status area that shows any errors and warnings encountered whilst converting your file. There are also Preferences, which at the moment include automatically opening the converted file in your default browser and enabling logging.

After selecting my FLA file I clicked ‘Convert’ and after a few seconds it happily converted my file without any errors or warnings. I was interested to see what it would make of the shape tweened animation that I had made in Flash as this seemed like quite a challenge to convert.

Wallaby’s HTML output:

The conversion process results in quite a few files being exported, as the animation is recreated using a combination of HTML, CSS and jQuery / JavaScript. Here’s a screenshot of the files I got:

It turns out that to make an HTML version of our tweened FLA animation we need two JavaScript files, one CSS file, one HTML file and a folder containing 246 SVG frames. I think it’s fair to say that tweened animations are a bit of a challenge! The Release Notes do state that Wallaby creates an SVG file for each frame of a shape tween. The approximate sizes of the exported files look like this:

  • Energy-html.html: 75kb
  • Energy.css: 49kb
  • Energy.js: 2kb
  • jquery-1.4.2.js: 70kb
  • Energy_assets: 176kb

That’s a total of 372kb in order to recreate the tweened animation in HTML / CSS / JS / SVG.
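The total is easy to sanity-check with a quick sketch (file names and approximate sizes taken from the list above):

```javascript
// Approximate sizes in kb of Wallaby's exported files, from the list above.
const wallabySizesKb = {
  "Energy-html.html": 75,
  "Energy.css": 49,
  "Energy.js": 2,
  "jquery-1.4.2.js": 70,
  "Energy_assets": 176 // folder containing the 246 SVG frames
};

// Total payload needed to play back the converted animation.
const totalKb = Object.values(wallabySizesKb).reduce((sum, kb) => sum + kb, 0);
console.log(totalKb); // 372
```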

Flash’s output:

So how does this compare with the files output by Flash? Well, not very well when you look at both the amount of files required and their file size:

  • Energy-flash.html: 2kb
  • Energy.swf: 4kb

A grand total of 6kb when it’s rendered as Flash swf output. That’s quite a difference in size, although admittedly Flash’s default HTML file doesn’t use any JavaScript such as SWFObject to embed the file, which is generally common practice, so I would argue that the Flash output should really have the following additional SWFObject files added to its output:

  • swfobject.js: 10kb
  • expressInstall.swf: 1kb

Even with that it’s still only a total of 17kb, about 1/20th the total size of the files that Wallaby outputs.
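For reference, a typical SWFObject embed of the kind described above might look something like this (the file names come from the output lists above; the container id, dimensions and minimum Flash version are illustrative):

```html
<!-- Fallback content shown on devices without Flash -->
<div id="animation">This content requires Flash Player.</div>
<script src="swfobject.js"></script>
<script>
  // embedSWF(swfUrl, containerId, width, height, minVersion, expressInstallSwf)
  swfobject.embedSWF("Energy.swf", "animation", "550", "400",
                     "9.0.0", "expressInstall.swf");
</script>
```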

Why not use video instead?

This particular animation is obviously a challenge for Wallaby to convert into anything close in file size, so perhaps in this case it would be a better option to use a video clip to provide a non-Flash version of the animation. The same animation can be output as an H.264 video at the same frame rate and it comes in considerably smaller, at 192kb:

  • Energy-video.html: 1kb
  • Energy.mov: 192kb

That’s about half the size of the Wallaby output. It’s also likely to play back much better on mobile devices such as iPhone / iPad / iPod touch, as Adobe warn that the output of shape-tweened animation can result in playback performance difficulties on iOS devices.

In the interest of being consistent I should of course add an additional JavaScript video embedding option such as JWPlayer, as again this is common practice when it comes to adding video on web pages. Using JWPlayer would add the following additional files:

  • jwplayer.js: 104kb
  • player.swf: 96kb
  • yt.swf: 1kb

This adds 201kb to the total file size required, bringing it to about 393kb, almost the same as Wallaby’s HTML5 output. I’m sure there are slimmer options than JWPlayer for embedding that could reduce that a bit, but I reference JWPlayer as I consider it to be one of the best cross-platform methods of embedding video on websites.

Of course I haven’t mentioned anything about delivering video in alternative codecs such as Ogg Theora or WebM to serve browsers like Opera, Firefox and Chrome; this would further increase the files and sizes involved. However, given that Wallaby is trying to provide a way for animated content created in Flash to be made playable on devices such as the iPad and iPhone, it could also be argued that providing any kind of Flash fallback for video is unnecessary. In that case we could drop the JWPlayer JavaScript completely and just use the regular HTML5 <video> tag with a single H.264 video, bringing us back to a considerably smaller size than Wallaby’s output in this instance.
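Dropping the Flash fallback, the HTML5 route really can be as simple as a single tag (the file name comes from the video output list above; the dimensions and fallback text are illustrative):

```html
<!-- Single H.264 source, no Flash fallback; plays natively on iOS devices -->
<video src="Energy.mov" width="550" height="400" autoplay loop>
  Your browser does not support the video tag.
</video>
```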

In Closing…

Overall Wallaby looks to be a pretty handy tool for people who are struggling to find a way to get their content viewable on the millions of devices that don’t (and likely never will) run Flash. For animation that doesn’t involve shape tweening I think the resulting file sizes will be a lot more favourable, and it will be a reasonably efficient way to create animated content using standards-based technology.

I would say that the primary target users for Wallaby, at least at first, are people doing online advertising. This is an area that is seeing a lot of activity, with tools like Sencha Touch appearing, and there’s definitely opportunity for a ‘killer app’ that makes creating banner advertising with all of these new emerging web technologies as familiar as working in the Flash IDE. There are a lot of technical challenges in there, as well as issues such as accessibility, graceful degradation etc, but a lot of companies are focusing on this challenge now, so it’s good to see that Adobe is thinking about various ways of providing tools for the job.

Sample Animation files:

I have included the HTML5 output as an iframe and also links to each animation in HTML5, SWF and H.264 formats. There is also a link to download all the assets of Wallaby’s output in a zip archive:

Wallaby’s HTML5 output:

View Wallaby’s HTML5 Output in new Window
Download the Wallaby HTML5 Assets as a Zip

SWF output:
View SWF Output in new Window

Video output:
View Video Output in new Window

VoltaicHD 2.0: Edit, Convert, Upload AVCHD / AVCHD Lite video

Shedworx‘ essential AVCHD tool VoltaicHD just took a healthy step forward in functionality with the recent release of VoltaicHD 2. Highlights of the update are the ability to preview AVCHD / AVCHD Lite clips, edit native AVCHD video, upload video to YouTube and output to presets like iPhone, iPod etc.

Native AVCHD / AVCHD Lite editing

Version 2 of VoltaicHD increases the scope of the app from simply being a tool to convert AVCHD format video footage to now include basic native editing of AVCHD footage. You can now preview AVCHD / AVCHD Lite video clips within the application and then set simple in and out points to define a section of a raw clip, which can then be trimmed down and converted.

The interface is reminiscent of the new QuickTime X player that comes with Mac OS X 10.6 Snow Leopard. It’s very easy to set the position of the start and end points, as the position in seconds within the movie is shown in a little tooltip when sliding them around. My one criticism is the same one I have with the new QuickTime X player: you can’t use the mouse to fine-tune the position of these points. That’s a little thing that I miss from the old QuickTime Pro player’s editing capabilities.

Overall though, the ability to trim down clips before conversion is a massive timesaver: instead of having to convert a whole chunk of AVCHD footage, you can roughly trim down to the section you want and convert only that bit. Definitely a great and helpful improvement!

Upload to YouTube, output to iPhone / iPod and AppleTV

Along with the new preview / edit capability there is also the ability to convert and upload trimmed clips to YouTube directly from VoltaicHD 2.0. There are also preset output options for iPhone / iPod and AppleTV.

One of the things you’ll notice when you run VoltaicHD 2 compared to the previous version is the new drop-down menu options at the top of the window. From this menu you can select the specific option that you want, then combine that with a little bit of previewing and trimming in the clip details panel and you’re ready to go.

Both of these options make it really easy to get footage off your camera and online or onto your devices, to take with you or watch on TV. Shedworx make another application called RevolverHD which enables you to create AVCHD DVDs that will work on most Blu-ray players; however, I find the ease of exporting video that’s ready to go onto my iPhone really convenient. As Blu-ray players become more commonplace I think I’ll use RevolverHD much more, as a perfect way to send HD footage to my extended family.

Worth the upgrade cost for existing users?

It’s most definitely an upgrade I’d recommend for any VoltaicHD user, and it’s worth noting that this is the first paid upgrade to Voltaic since it was released in July 2007. Don’t forget that VoltaicHD is available for both Mac OS X and Windows too.

If you’re an existing VoltaicHD user you can upgrade to version 2 for only $9.99, or if you happen to have bought the previous version since July 2009 you’re eligible for a free upgrade to VoltaicHD 2.0. First-time customers can buy VoltaicHD 2.0 for $39.99, which is still a great deal for a great bit of software!