New Mozilla Encoder Improves JPEG Compression
jlp2097 writes "As reported by Heise, Mozilla has introduced a new JPEG encoder (German [Google-translated to English]) called mozjpeg. Mozjpeg promises to be a 'production-quality JPEG encoder that improves compression while maintaining compatibility with the vast majority of deployed decoders.' The Mozilla Research blog states that mozjpeg is based on libjpeg-turbo with functionality added from jpgcrush. They claim an average of 2-6% additional compression for files encoded with libjpeg, and 10% additional compression for a sample of 1,500 JPEGs from Wikipedia — while maintaining the same image quality."
Why aren't we using PNG? (Score:1)
Re:Why aren't we using PNG? (Score:5, Informative)
Re: (Score:2)
I wish I had mod points. Thank you.
Re: (Score:3)
Yes, and he's pointing out that the exact thing the parent said as an advantage for PNG is also its disadvantage.
A lossless compression format cannot compete with a good lossy compression format in terms of file sizes for arbitrary content, even though it wins by definition in terms of fidelity. The web, even today, is very bandwidth constrained, and thus file size is one of the most important things to optimize, for both the client and the server. Fidelity is often not a very important consideration.
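A quick way to see the parent's point with nothing but the standard library: DEFLATE (the lossless algorithm inside PNG) barely shrinks high-entropy, noise-like data (a crude stand-in for photographic sensor detail), while it collapses flat synthetic data. A sketch, with the 100,000-byte buffers being arbitrary illustrative choices:

```python
import random
import zlib

# Photo-like stand-in: high-entropy noise, similar to camera sensor detail.
rng = random.Random(0)
noise = bytes(rng.getrandbits(8) for _ in range(100_000))

# Simple-graphics stand-in: one large flat area of a single value.
flat = bytes(100_000)

noise_ratio = len(zlib.compress(noise, 9)) / len(noise)
flat_ratio = len(zlib.compress(flat, 9)) / len(flat)

print(f"noise: {noise_ratio:.2f}x original size")  # near 1.0: almost no savings
print(f"flat:  {flat_ratio:.4f}x original size")   # a tiny fraction: huge savings
```

Real photos are not pure noise, so PNG does better than this worst case, but the gap to a lossy codec remains large.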
Re: (Score:2)
Re: (Score:3)
But why have the many successors to jpg that provide better lossy compression not caught on?
Without getting into a full-blown Doctoral Thesis, it's usually because either they suffer performance issues, or don't do nearly as good a job of preserving the visual integrity of the source image. JPG is a good balance of speed, quality preservation, and size of the compressed file.
Nah, I think it's mostly because JPEG is good enough. JPEG2000, for example, also provides perfectly acceptable performance and quality, with significantly-reduced file sizes. But unlike JPEG, JPEG2000 decoders aren't already available everywhere. The slightly-reduced file size isn't sufficient justification for the risk that some users might not be able to see the photo. An improved JPEG encoder helps (a little) with file size without incurring the need for a new decoder, so it's immediately useful.
Re: (Score:3)
I think it's mostly because JPEG is good enough. JPEG2000, for example, also provides perfectly acceptable performance and quality, with significantly-reduced file sizes. But unlike JPEG, JPEG2000 decoders aren't already available everywhere.
It's the fax-machine effect: JPEG will be around forever because everything, and I mean everything, that creates, processes, manipulates, and displays images speaks JPEG. If Jobs were still alive and decided that from now on iWhatevers were only going to do JPEG2000 (and it's not just for file size reasons; image quality is also vastly improved [fnordware.com]), you can bet that we'd have a surge in JPEG2000 adoption as soon as the first JPEG2000-only iWhatever was released.
(Personally I'd opt for JPEG-XR, which is more
Re: (Score:3)
JPEG XR is actually quite good and is now an open standard. I recently did an extensive evaluation of JPEG 2000 vs. JPEG XR. While JPEG 2000 has slightly better compression quality (fewer visible artifacts) at the same file sizes, its decode performance is substantially slower than JPEG XR's (the same is true for encode performance, but decode is much more important). In my testing, one of the fastest JPEG 2000 libraries, Kakadu, is anywhere from 1.8 to 2x slower than JPEG XR at decoding files. Kakadu is
Re: (Score:2)
While JPEG 2000 has slightly better compression quality (fewer visible artifacts) at the same file sizes, its decode performance is substantially slower than JPEG XR (the same is true for encode performance, but decode is much more important).
How much of this is due to hardware support for core JPEG operations in GPUs and (to a lesser extent) CPUs? If wavelet-based JPEG took off, would it just be a matter of time before hardware vendors added explicit support for it to their instruction sets, at which point the speed difference would vanish?
Re:Why aren't we using PNG? (Score:5, Interesting)
If you're talking about simple web graphics, then yes, PNG is often a good choice. Lossy compression simply makes more sense for photos, as the compression ratio is that much better. Always using PNG is idiotic, as is always using JPEG. JPEG2000 is not our saviour.
Re:Why aren't we using PNG? (Score:5, Insightful)
It's a shame JPEG2000 debuted dead on arrival thanks to patent encumbrances. Creation of a superior open lossy image compression standard seems to have been left behind in favor of video. We have PNG and Theora, but nothing free that improves on jpeg.
Re: (Score:1)
I don't think patents are the problem. I would say it's more that JPEG2000 is slow as molasses, and is trying to be lossless and lossy at the same time while failing at both: lossless is way larger than PNG, and lossy throws away the advantage of the downsampled YCbCr colorspace that JPEG has. It's not clearly superior to preexisting stuff, except for people with strange needs, who in fact are using it. It just never went mainstream.
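The downsampled-YCbCr point can be made concrete with a little arithmetic: with 4:2:0 subsampling, JPEG keeps luma at full resolution but stores each chroma plane at half resolution in both dimensions, halving the raw sample count before any transform or entropy coding even runs. A small sketch (the 1920x1080 frame size is just an example):

```python
def samples_per_frame(width, height, subsampling="4:2:0"):
    """Count raw Y + Cb + Cr samples for common chroma subsampling modes."""
    luma = width * height
    if subsampling == "4:4:4":    # chroma at full resolution
        chroma = 2 * width * height
    elif subsampling == "4:2:2":  # chroma halved horizontally
        chroma = 2 * (width // 2) * height
    elif subsampling == "4:2:0":  # chroma halved in both dimensions
        chroma = 2 * (width // 2) * (height // 2)
    else:
        raise ValueError(subsampling)
    return luma + chroma

full = samples_per_frame(1920, 1080, "4:4:4")
sub = samples_per_frame(1920, 1080, "4:2:0")
print(sub / full)  # 0.5: half the raw data before compression proper starts
```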
JP2 is used, just not on the web. (Score:2)
Yeah, lots of universities use it for a lot of things, like scientific and cultural heritage images... they serve the images up, if need be, through the proprietary lurawave image server... not a great solution from a systems perspective, but it's what they like.
Personally, I think the lack of widespread adoption makes it a serious preservation concern.
Re: (Score:2)
So, JPEG2000 is dead and WebP not alive yet.
Re: (Score:2)
Re: (Score:2)
JPEG is good enough that there is little motivation to build browser detection to serve up different formats to different browsers. So unless MS decides to support WebP, I don't see it taking off.
Exactly (Score:1)
PNG 8 to replace GIFs
PNG 24 to replace JPEGs
Re: (Score:2)
Yeah, I also love those stitched panoramas with a few GiB file size, really a great idea ;)
PNG is a great format, of course, and I use it a lot, but such a generalization doesn't make any sense. Depending on the use case, you have to decide which one to use.
Re: Exactly (Score:2)
1) PNG8 can support full alpha transparency.
2) PNG is better with fewer colours and blocks of one shade, as it can compress by merging close shades into palette entries. JPEG is better at compressing images with lots of different colours, like photos, as it merges neighbouring pixels.
Re: (Score:2)
Note that in paletted PNG files, transparency is always handled by specifying alpha values for palette entries, even if you are only doing simple binary transparency. So at the format level there is no real difference; it's just that some old browsers can't handle partially transparent palette values correctly.
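For the curious, the palette-alpha mechanism is visible in the chunk layout itself: for PNG color type 3 (indexed), an optional tRNS chunk carries one alpha byte per palette entry. A minimal sketch that assembles such a file from the standard library (a hypothetical 2x2 two-color image; chunk layout per the PNG spec, not validated against any particular viewer):

```python
import struct
import zlib

def chunk(ctype, data):
    """A PNG chunk: length, type, data, and a CRC over type + data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

# IHDR for a 2x2 image: bit depth 8, color type 3 (indexed).
ihdr = struct.pack(">IIBBBBB", 2, 2, 8, 3, 0, 0, 0)
plte = bytes([255, 0, 0,  0, 0, 255])  # palette: red, blue
trns = bytes([255, 128])               # one alpha per palette entry:
                                       # red opaque, blue half-transparent
raw = bytes([0, 0, 1,                  # each row: filter byte 0, then indices
             0, 1, 0])
png = (b"\x89PNG\r\n\x1a\n"
       + chunk(b"IHDR", ihdr)
       + chunk(b"PLTE", plte)
       + chunk(b"tRNS", trns)          # the transparency lives here
       + chunk(b"IDAT", zlib.compress(raw))
       + chunk(b"IEND", b""))
print(len(png), b"tRNS" in png)
```

Whether the viewer honors the partial alpha (128) or rounds it to on/off is exactly the old-browser problem the parent describes.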
Re: (Score:2)
It can mean 8-bit indexed PNG with binary transparency, or it can mean 8-bit indexed PNG with full transparency.
IIRC in the format itself there isn't really a distinction, PNG transparency for indexed images is always handled by specifying alpha values for palette entries. It's just old versions of IE fail to handle them correctly.
Re: (Score:1)
Often the simplest solution doesn't cover the real world. Even with pngquant, a large photograph will be three times as large or more in PNG24 than in JPEG. 300 KB vs 100 KB is nothing to sneeze at when you're on a mobile device at the edge of cell coverage.
Not to mention the server bandwidth usage when the bill comes due.
Re:Exactly (Score:5, Informative)
General image software support is poor for both.
Re: (Score:2)
Re:Why aren't we using PNG? (Score:5, Informative)
Re: (Score:2)
Re: (Score:2)
Now that's not to say you should use JPEG absolutely everywhere. Stuff like computer generated images with 1 pixel lines or text loo
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Why aren't we using PNG? (Score:4, Interesting)
Since I started looking at web pages with JPEG images, the speed of my internet connection has increased by roughly 345,000%, the size of my hard disk by 200,000%. Why is a 300% increase in image size a concern?
Re: (Score:3, Insightful)
Re: (Score:2)
Since I started looking at web pages with JPEG images, my internet connection has almost doubled in speed and now there are pages that are basically unviewable.
Many locations don't have any other option than dial-up, and many more have various caps on the amount you're allowed to download before charges increase.
Re: (Score:2)
There are services (Opera's work quite well, Google has one too) that will re-compress any images to lower quality lossy formats and into a single response to avoid round-trips. I don't think big image files are really the main problem for people still on dialup.
no SAT (Score:2)
Many locations don't have any other option than dial-up
Are they all on a north face of a mountain? Because if there's a tall building in the way, the area is probably urban enough to get DSL or DOCSIS, and if there isn't a mountain in the way, it can more than likely get satellite.
Re: (Score:2)
Because you are not the only one in this world; you are probably in the top 0.01 percent for connection speed.
A lot of people like me have slow connections, so I *hate* big images, and particularly sites which put images everywhere.
I'm in a rural region, so I don't expect a faster connection anytime soon. And moving is not an option either.
There is a proverb that says that a picture is worth a thousand words, but it's not true with the Internet.
Re: (Score:2)
There are plenty of services like Opera Turbo that will recompress all images as smaller lossy images. Why should all users get a degraded experience when those on slow connections have options to automatically recompress images to be better suited for their connections?
Re: (Score:2)
Because the sites using lots of images tend to also use lots of Javascript.
I don't see why I should upgrade my computer to be able to browse the web.
The worst sites I browsed tend to put pictures everywhere, even (and especially) when unnecessary.
And don't make me laugh with the "experience" of the web.
Why do you think the simple Google search won over the "portal" search of Yahoo?
Has simplicity become so irrelevant?
Re: (Score:2)
I don't see what javascript and computer upgrades have to do with PNG. Heck, the point of Opera's stuff was to run on lower-end devices. It was intended for use on mobile originally. And it still supports javascript.
Re: (Score:2)
It would help if many paint programs had a default JPEG quality level higher than 70%. That's okay when dealing with 10 megapixel photos that are only shown on the screen or destined for 4x6 prints. Not so nice when dealing with large prints or artwork. I hate seeing Death by JPEG on sites like DeviantArt all the time.
Re:Why aren't we using PNG? (Score:4, Insightful)
PNG is great for everything but actual photos, and should be used for just that: everything but photos. But photos really do need the extra boost from lossy compression.
Re: (Score:2)
Images in Firefox (Score:1)
Now if Firefox could just scale images properly when viewing them.....
Seem Negligible (Score:1)
Seems like a negligible improvement. I mean really. With hard drive space plentiful, and bandwidth faster than most users can use at any given moment, saving 20-60 KB on a 1 MB file is like a fart in the wind, even for mobile users.
I'm with the AC in the first post, I use PNG for 90% of my images, since it supports transparency. The file may be slightly bigger, but who cares.
Re:Seem Negligible (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
I found switching large photographs on my site from png to jpeg led to a noticeable loadtime increase. It's not a lot, but it is noticeable. However, I'm sticking to PNG for any non-photographic images.
Re:Seem Negligible (Score:4)
I found switching large photographs on my site from png to jpeg led to a noticeable loadtime increase.
Decrease?
Re: (Score:3)
*facepalm* Yes, that would be the word I was looking for. They had me answering the Helpdesk phone today, so my brain is a little fried from too much user interaction.
Re:Seem Negligible (Score:5, Insightful)
If 2-6% is nothing, why not donate that percentage of your monthly salary to a good cause?
Re: (Score:2)
I make regular contributions to charitable organizations. It gets deducted from my pay cheque every two weeks :)
Re: (Score:1)
If 2-6% is nothing, why not donate that percentage of your monthly salary to a good cause?
Yes, please invest in my new bitcoin exchange. I'm calling it Mt. DevNull. Catchy!
Incremental improvements in compression are all you are going to get these days. The field is pretty mature, so 2-6% is exciting. Well, to compression geeks.
Re: (Score:2)
Re: (Score:3)
A few KB saved by an end user on a high speed connection isn't much, but...
A few KB multiplied by millions of users accessing a single site soon adds up.
And it's also of benefit to those on slow or metered connections.
Re:Seem Negligible (Score:5, Insightful)
Anyone with a metered internet connection. Which is a depressingly large set of people, and signs are that it's going to get larger.
Re: (Score:2)
Or anyone who serves gigabytes of content per hour and possibly terabytes per day -- google, facebook, wikipedia, imgur, etc.
Re: (Score:2)
Seems like a negligible improvement. I mean really. With hard drive space plentiful, and bandwidth faster than most users can use at any given moment, saving 20-60 KB on a 1 MB file is like a fart in the wind, even for mobile users.
I'm with the AC in the first post, I use PNG for 90% of my images, since it supports transparency. The file may be slightly bigger, but who cares.
Slightly better? For full color photographs, PNG is *much* bigger. Anyone that's serving up a lot of images to users cares because of bandwidth and storage costs.
I picked a random Wikipedia image:
https://upload.wikimedia.org/w... [wikimedia.org]
The 1200x900 JPG is around 300 KB. I converted it to PNG with GIMP, and the resulting file was 1.7 MB — almost 6 times larger. The file size after converting with ImageMagick was about the same.
For busy websites, an improvement of 2-6% better jpeg compression can save significant money w
Re:Seem Negligible (Score:5, Informative)
Slightly better? For full color photographs, PNG is *much* bigger. Anyone that's serving up a lot of images to users cares because of bandwidth and storage costs.
I picked a random Wikipedia image:
https://upload.wikimedia.org/w... [wikimedia.org]
The 1200x900 JPG is around 300 KB. I converted it to PNG with GIMP, and the resulting file was 1.7 MB — almost 6 times larger. The file size after converting with ImageMagick was about the same.
For completeness, I took a 94MB full color 6496x4872 TIFF and converted it to PNG (compressionlevel=9) and got a 64MB file. Then compressed the same TIFF to JPG (Quality=90), and got a 7MB file.
Re: (Score:2)
Has anyone actually tried their code to see how effective it is? I don't have a system to compile it on at the moment.
Re: (Score:2)
Has anyone actually tried their code to see how effective it is? I don't have a system to compile it on at the moment.
Seems to work as advertised, if you don't care how long it takes to convert an image.
I compiled their source and ran their cjpeg against /usr/bin/cjpeg already installed on my system, and it did create jpegs that are 6 - 10% smaller in filesize with the same apparent image quality (I just zoomed in and eyeballed them side by side, I didn't do any extensive analysis).
However, at a quality level of 75, the Mozilla code took 10 times longer to run, while at a quality level of 90, the Mozilla code took nea
Re: (Score:2)
Thumbnails can sometimes be pushed as low as 75-80 quality, but you start to notice the JPEG artifacts.
Re: (Score:2)
You start to care once you multiply those 2% across millions of users. Any savings at such a basic level are multiplied by how often the resource is used. So no, you don't care about this for your CRUD web application, but Wikipedia saving 2% of its bandwidth translates into one less datacenter required, which means thousands of dollars.
Re:Seem Negligible (Score:5, Insightful)
Seems like a negligible improvement..
Yes, WebP would be a better choice.
Webp is amazing (Score:5, Informative)
Agreed, it's a much better choice. I actually converted my entire image library to .webp, and I use Irfanview to view the images. The filesize savings were huge, with no visible reduction in quality.
Some examples:
4.5 MB JPG -> 109 KB webp
3.66 MB JPG -> 272 KB webp
3.36 MB JPG -> 371 KB webp
One folder of mixed JPGs and PNGs with a total of 169 MB was converted to WebP. The total size of all contents of the folder ("directory", whatever you want to call it) was 6.44 MB. I was so impressed that I kept records of the results.
Not only would this be HUGE for sites like Wikipedia, but it also significantly decreased the amount of space that I was using in my cloud storage account.
Honestly for all of their PR about a better, more open web, all we really get is the same old politics and attempts at controlling what is and is not the standards. They still behave like children. Mozilla, Google, I'm not taking sides. They're both at fault.
Re: (Score:2)
Agreed, it's a much better choice. I actually converted my entire image library to .webp, and I use Irfanview to view the images. The filesize savings were huge, with no visible reduction in quality.
Some examples: 4.5 MB JPG -> 109 KB webp 3.66 MB JPG -> 272 KB webp 3.36 MB JPG -> 371 KB webp
It would help to know more about your experiment. I can get quite big size improvements here by recompressing my camera's (Canon EOS) JPEG files to... JPEG! And with no visible quality difference either. They go from 6.7 MB for the Canon file to 3.1 MB for quality 90 in ImageMagick, 1.7 MB for 75, and 1.4 MB for 65. And in your experiment the WebP quality scale may not exactly match the JPEG one, which makes comparisons even harder.
Re: (Score:2)
One of the problems with JPEG is that it works (caveat about luma/chroma subsampling) on 8x8 pixel blocks. This is great for medium-resolution images (e.g. 72-200dpi range), but not so great for high-resolution (e.g. 1200dpi).
I grabbed a 3032x1986 image [wordpress.com] (warning: large image), and here's what I got.
PNG, compression level 9: 6.1MB
JPEG 4:4:4 100: 2.7MB
JPEG 4:4:4 95: 1.2MB
JPEG 4:0:0 95: 0.9MB
WEBP 100: 1.5MB
WEBP 95: 0.5MB
Note that I have no experience with all of the WebP options. I just used -m 6 -q quality in
Re: (Score:2)
..... a new format that doesn't seem like it will ever be feature-complete.
What features do you see WebP lacking. It uses the RIFF container format that allows XMP metadata, which itself can include EXIF data. It includes lossless and lossy modes, animation [gstatic.com] and alpha channel (transparency) [google.com]. What do you think is missing?
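The RIFF-container claim is easy to verify at the byte level: a WebP file starts with b'RIFF', a little-endian size, and the form type b'WEBP', followed by fourcc-tagged chunks, with extended features advertised in a VP8X chunk. A sketch that walks a synthetic header (the flag-bit position for alpha, 0x10, is my reading of the container spec and should be treated as an assumption):

```python
import struct

def riff_chunks(data):
    """Yield (fourcc, payload) pairs from a RIFF/WEBP byte string."""
    assert data[:4] == b"RIFF" and data[8:12] == b"WEBP"
    pos = 12
    while pos + 8 <= len(data):
        fourcc = data[pos:pos + 4]
        size, = struct.unpack("<I", data[pos + 4:pos + 8])
        yield fourcc, data[pos + 8:pos + 8 + size]
        pos += 8 + size + (size & 1)  # chunk payloads are padded to even length

# Synthetic header only: a VP8X chunk whose flag byte sets the assumed
# alpha bit (0x10); no actual image data follows.
vp8x = b"VP8X" + struct.pack("<I", 10) + bytes([0x10]) + bytes(9)
blob = b"RIFF" + struct.pack("<I", 4 + len(vp8x)) + b"WEBP" + vp8x

for fourcc, payload in riff_chunks(blob):
    print(fourcc.decode(), "alpha flag set:", bool(payload[0] & 0x10))
```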
Re: (Score:2)
What features do you see WebP lacking
Ability to be displayed on most browsers?
Re: (Score:2)
What features do you see WebP lacking
Ability to be displayed on most browsers?
Not really a limitation of WebP any more than "ability to be owned by most adults" would be a limitation of a Ferrari.
Re: (Score:2)
So the price wouldn't be a limitation of the Ferrari? Well, it's up to you to ignore what most would consider the first limitation of the object... but still...
If WebP was available in all browsers like GIF, PNG and JPG, I would've had a deeper look into it. It's not. I couldn't care less about this file format. What use could I possibly find for it? What purpose does it serve for anyone? If you need lossy, go JPG. If you need lossless, go PNG. WebP doesn't fit any scenario...
Re: (Score:2)
This is not for the benefit of the users but for webmasters.
If you have a site with any decent amount of traffic, you pay for bandwidth and you have a content delivery network. 10% smaller images translates into 10% savings.
Moreover, Google takes site speed into account when ranking sites.
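To put hypothetical numbers on the webmaster's side of this (every figure below is invented purely for illustration):

```python
# All figures are hypothetical, chosen only to illustrate the scaling argument.
monthly_image_gb = 10_000   # 10 TB of image traffic per month
cost_per_gb = 0.08          # assumed CDN egress price, in dollars
savings_fraction = 0.10     # the 10% size reduction discussed above

monthly_savings = monthly_image_gb * cost_per_gb * savings_fraction
print(f"${monthly_savings:.2f}/month, ${12 * monthly_savings:.2f}/year")
```

Trivial for one small site, but the same multiplication applied to a large CDN bill is exactly why operators care about single-digit percentages.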
Re: (Score:1)
Seems like a negligible improvement. I mean really. With hard drive space plentiful, and bandwidth faster than most users can use at any given moment, saving 20-60Kb on a 1Mb file is like a fart in the wind, even for mobile users.
It would not be worth the effort for one website or even ten. But what is proposed is an improvement to the most commonly used JPEG implementation in the world. The cost will be amortized over millions of websites as software is upgraded over the next few years.
To see how this works, let's make up some numbers. Let's say that the whole effort will consume $100,000 worth of labor. Let's guess that within five years it will be installed on one million websites. That means it will cost $0.10 per website. Is it
Re: (Score:2)
But it isn't 10%, it's 2-6% :)
But I see the point, with large numbers of files served, it can add up.
Re: (Score:2)
Serving static files is pretty cheap nowadays.
The main reason to keep image filesizes down on the web is to make life easier for those end users who are stuck on crappy dialup or cellular connections.
Re: (Score:1)
Thus reducing the cost of the content for the end user, especially for those who are stuck on metered bandwidth.
Such reading comprehension failure is inexcusable. Clearly you also do not have an understanding of end-to-end costs.
Re: (Score:2)
Re: (Score:2)
If your bandwidth as a provider of content is costing you 400K and you can reduce it by 1% just by using a new image standard, that's a nice 4K saving per year on service alone. Now you have to decide if saving 4K will cost you more than 4K. In some cases it's not about saving money but rather avoiding the need to upgrade hardware and infrastructure.
Re: (Score:3)
jpegtran is a good util for shaving 5-10% off most jpegs out there.
Something to watch for with jpeg, "arithmetic coding" reduces your filesize compared to "huffman coding" but it also reduces compatibility. It caused me a fair bit of head scratching trying to work out why pdflatex wouldn't accept the jpegs that came out of jpegcrop (which started using arithmetic coding by default).
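One way to find out which entropy coder a JPEG actually uses, and so predict this kind of compatibility trouble, is to walk the marker segments and inspect the SOF marker: the SOF0-SOF7 variants indicate Huffman coding, while SOF9-SOF15 indicate arithmetic coding. A stdlib sketch, exercised only on the synthetic fragments below, so treat it as a starting point rather than a robust parser:

```python
import struct

def jpeg_entropy_coding(data):
    """Report the entropy coder of the first SOF marker in a JPEG byte string."""
    assert data[:2] == b"\xff\xd8", "missing SOI marker"
    pos = 2
    while pos + 4 <= len(data):
        assert data[pos] == 0xFF, "expected a marker"
        marker = data[pos + 1]
        # SOFn markers are 0xC0..0xCF except DHT (0xC4), JPG (0xC8), DAC (0xCC);
        # SOF9 and above signal arithmetic coding.
        if 0xC0 <= marker <= 0xCF and marker not in (0xC4, 0xC8, 0xCC):
            return "arithmetic" if marker >= 0xC9 else "huffman"
        if marker in (0xD9, 0xDA):  # EOI / start of scan: stop scanning headers
            break
        seg_len, = struct.unpack(">H", data[pos + 2:pos + 4])
        pos += 2 + seg_len
    return "unknown"

# Synthetic header-only fragments, just enough structure to exercise the walker.
huffman = b"\xff\xd8\xff\xc0" + struct.pack(">H", 5) + bytes(3)
arith = b"\xff\xd8\xff\xc9" + struct.pack(">H", 5) + bytes(3)
print(jpeg_entropy_coding(huffman), jpeg_entropy_coding(arith))
```

Running this over a batch of files before feeding them to a tool like pdflatex would have caught the jpegcrop surprise early.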
Re: (Score:2)
Needed for Digital Cameras (Score:2)
My digital camera has horrible compression. I can load and save the pictures with pretty much any application, and the size of the files is reduced significantly without any noticeable image quality reduction. (And yes, I am saving it in the original size.) Maybe it's just my old Sony camera, but it's likely a common issue--I expect embedded compression in consumer devices worries more about simple and fast than best quality for the file size.
Re: (Score:2)
Do you know that the quality isn't being reduced? An image manipulation program like GIMP may not make it clear that it's redoing the lossy part and further reducing quality, even if asked to save at the same quality setting.
jpegtran is a command line tool that can recompress a jpeg image without changing the quality. If the original compression was poorly done, jpegtran will shrink the file. If jpegtran can shrink your camera's photos, then you know your old camera does a hasty job on the compression.
Re: (Score:2)
Does the loading and saving keep EXIF? My camera puts in a lot of EXIF information, stripping it out can save quite a bit of space.
Compatible with all except what you want to use, (Score:2, Interesting)
is what I get from "compatible with the vast majority of decoders".
Sounds like it breaks something.
I wish they would focus on WebP instead (Score:5, Interesting)
JPEG XR (Score:1)
The resistance [mozilla.org] to support WebP [google.com] in Mozilla seems to be more politically motivated than technical.
Why not add JPEG-XR as well?
https://en.wikipedia.org/wiki/JPEG_XR
Re: (Score:2)
The resistance [mozilla.org] to support WebP [google.com] in Mozilla seems to be more politically motivated than technical.
Why not add JPEG-XR as well?
https://en.wikipedia.org/wiki/JPEG_XR
"JPEG XR[3] (abbr. for JPEG extended range[4]) is a still-image compression standard and file format for continuous tone photographic images, based on technology originally developed and patented by Microsoft..."
Keyword in bold. Still, a very nice format.
Re: (Score:2)
Yes as of last April only:
"In April 2013, Microsoft released an open source JPEG XR library under the BSD licence.[41][42] ... the previously released "HD Photo Device Porting Kit"[43] was incompatible with the GNU GPL."
Re: (Score:2)
That's a software license covering a reference software implementation that Microsoft provided, not a patent license. They've made the patents freely available to implementers since 2007 as part of their Microsoft Open Specification Promise:
Microsoft has patents on the technology in JPEG XR. A Microsoft representative stated in a January 2007 interview that in order to encourage the adoption and use of HD Photo, the specification is made available under the Microsoft Open Specification Promise, which asserts that Microsoft allows implementation of the specification for free, and will not file suits on the patented technology for its implementation,[39] as reportedly stated by Josh Weisberg, director of Microsoft's Rich Media Group. As of 15 August 2010, Microsoft made the resulting JPEG XR standard available under its Community Promise.
Re: (Score:3, Informative)
The resistance [mozilla.org] to support WebP [google.com] in Mozilla seems to be more politically motivated than technical.
AMEN!!!! WebP is modern. JPEG, GIF and PNG are all older than most pop stars. Why are we still using the image-compression equivalent of MPEG-1?
Seriously, this is so dumb. I continue using Firefox for two specific reasons (tagged bookmarks and Pentadactyl) but Vimperator and Pocket are making Chrome more tempting. I choose WebP (using the official encoder I build directly from Google's repository) for my online photo storage. Decades of photos and scans I would estimate occupy about 1/8th the space of JPE
JPEG is good enough (Score:2)
0) JPEG is past, present, and near future. Well supported everywhere.
1) JPEG optimization could be better. Mozilla is doing more of that.
2) Patents on enhancements to JPEG from minor obvious ones to significant compatibility breakers prohibit improvements. JPEG's final compression step was poor from the beginning and the better stuff was patented and unused. At least a decade ago StuffIt used modern binary compression to replace the final phase, which was exempt from existing patents; however, StuffIt pate
Re: (Score:2)
Unfortunately WebP isn't all that good for a next-gen format.
http://people.mozilla.org/~jos... [mozilla.org]
Re: (Score:2)
Can someone implement support as a plugin? Or is that non-trivial?
MJPEG (Score:2)
Use JPEG2000 instead (Score:2)
I would rather see JPEG 2000 support.
Re: (Score:3)
Re: (Score:2)
I thought the world moved on from JPEG a long time ago.
Either you are trolling or you operate in some strange niche that is isolated from the rest of the world.
Back in the real world, JPEG is by far the dominant format for lossy compression of photographic images. The wide compatibility of JPEG has outweighed the advantages of more modern formats.
Still patented?
There have been a number of patent claims against baseline jpeg over the years but afaict none of them have really stuck.
There was also previously a patent on the optional arithmetic coding feature, but that has since expired.
Re: (Score:2)
JPEG is the MP3 of images. Good enough and so ubiquitous that nobody even tries to compete anymore.
Cellular (Score:2)
Re: (Score:2)
These days people can use some wifi instead of paying for their own net connection. The resulting speed is highly situational but can be e.g. 70 to 90 KB/s downloads or less.
Re: (Score:2)
I didn't want to get into details, but over here semi-private hotspots are popular and pretty ubiquitous (a significant subset of ISP-provided routers double as a secure hotspot that doesn't interfere with the main user's home network). You then need credentials from the ISP in question (a login/password). The use cases are if your DSL/broadband is down, if you are on the move, or for "unofficial" use.
So that works in urban areas actually. Wifi isn't necessarily great if you connect to some router in the