South Park – Stick of Truth

Given the amount of popularity this game attracted, I decided to give it a try, and finished it a while ago.

It’s probably been written about heaps of times by others, so I won’t dwell on much other than my own thoughts.

Overall, it’s surprisingly decent for a game based on a TV show.  In short, South Park fans should probably love it; for everyone else, it should still be a decent game.  I fall somewhat into the latter category, having only really seen a few episodes of the show.

The game’s highlight is perhaps that it captures the general South Park feel of humor without ruining much of the play experience.  These two aims (humor and game play) are somewhat at odds with each other at times, which means curbing each of them back to achieve a decent balance (for example, you can parody RPGs, but not so much if you want to be a somewhat serious RPG).  In my opinion, the creators have done a reasonable job at this, though it does mean that there is a bit of a compromise here.

Whilst the humor aspect is somewhat unique (at least to me), the actual game play side of things only seemed mediocre.  Nothing particularly stood out to me above a typical RPG – perhaps the ability to use an item without using up your turn was an interesting mechanic, but that’s about it.  Whilst this may actually be intentional (to make it easier to poke fun at RPGs), it certainly is a weak point.

The combat, in particular, felt rather gimmicky.

  • Do we really have to mash a button every time to use an ability?  I don’t particularly mind the button mashing elsewhere in the game, as it doesn’t occur often, but it gets annoying pretty quickly in combat.
  • There doesn’t seem to be a whole lot of consistency in how you activate abilities.  Sometimes you left click; other times you need to right click, press a button, mash a key, mash a combination of keys or play some other mini-game.  Remembering it all seems like an unnecessary chore, though fortunately the game tells you what to do when you’re about to use an ability.  But seriously, is this even necessary?
  • And the whole ‘click when you see a *’ thing seems… well… I don’t mind it so much, but it feels like a cheap trick to keep the player engaged and alert, rather than letting them sit back and issue commands without much thought.  It kinda reminds me of Final Fantasy 8’s summon Boost mechanic.
    • If you don’t know what that’s like: summons had rather long animations – around 30 to 80 seconds in length (yes, an 80 second animation every time you activated an ability; well, I suppose it could be worse – imagine if something of that duration (not content) played every time you went into battle…).  So to keep the player engaged whilst they watch the same animation for the 50th time, you have the option to boost the power of your attack by repeatedly pressing a key when indicated on screen.  SoT’s click timing (as well as some of its mini-games) feels somewhat similar.

The balance/difficulty seems like something that could be improved too:

  • Stats seem to vary drastically between levels – by level 15, I had nearly 9000 health, compared with the ~100 or so (whatever the amount was) you start off with at level 1
  • I found the game generally got easier later on, possibly due to the above issue
  • …or maybe that you can remove 75% of an enemy’s armor, in just two turns using an ability you get halfway through the game…
  • There were times where all my attacks did 1 damage to the enemy, but I was still able to win just by relying on status ailments.  Actually, perhaps this isn’t so bad…


To reiterate: a relatively unique experience (at least for me) in a game.  The comedy entertained me, though it was a little subdued; game play was passable; and the combat system, which felt a little gimmicky, was perhaps the weak point.  Overall a decent game that’s probably worthwhile just for the experience.

Centre screen subtitles?

Subtitles for video content are pretty much always aligned to the bottom-centre of the video frame.  There, they don’t interfere much with the visual content, but I can’t think of any other particular reason why they should be there (perhaps there’s some history I’m missing out on).  Top-centre alignment is rare (although it seems just as feasible as bottom-centre) – it’s often only used for translating signs or placing a secondary speaker’s lines.

However, a problem I’ve noticed with all the anime I’ve watched is that this positioning really draws your focus towards the bottom of the screen, especially if you’re a slow reader like I am.  It means that I have to rely more on my peripheral vision to see the actual content, or, as I quite often do, frequently pause the video to give myself enough time to look up and see what’s going on.  This is perhaps one of the key reasons why I prefer dubs over subs.

And if anything happens to be aligned top-centre, such as translation notes or secondary speaker lines, it’s much easier to miss if your attention is at the bottom of the video frame.  Though this could easily be solved by placing all subtitles at the bottom of the screen and using styling to differentiate the lines.

So What?

A more radical idea just came to me though: what if subtitles were centred on the screen?  This could make things easier to read by keeping you focused on the content.  Semi-transparency could be used to mitigate most of the downsides of the text obscuring content, and it’s not hard to do.  ASS, the standard subtitle format used these days for anime fansubs, already supports both these features (and a lot more), unlike many other formats such as DVD subtitles, which don’t provide this flexibility and may have made this idea less practical.
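To give an idea of what this looks like in raw ASS (the style name and most values here are arbitrary examples): alignment 5 means screen centre, and the leading &H60/&H80/&HE0 components of the colours are the alpha values suggested in the guide further down.  The same effect can be applied per-line with an override tag, where \an5 centres the line and \alpha sets the transparency of all components.

Style: Centered,Arial,36,&H60FFFFFF,&H000000FF,&H80000000,&HE0000000,0,0,0,0,100,100,0,0,1,2,1,5,10,10,10,1

{\an5\alpha&H60&}Some dialogue line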

Here’s a list of pros & cons of centre aligning subtitles that I could come up with:

Pros:

  • Keeps your focus towards the centre of the screen
  • Makes it easier for slow readers to read subtitles whilst watching what’s actually going on in the video
  • Generally an easy change to make (see below for a quick guide)
  • Subtitle authors have the option of embedding two subtitle tracks or supplying an alternatively styled subtitle file, which means that nothing changes for the viewer unless they want to experiment with centre aligned subtitles

Cons:

  • May be more distracting
  • May obscure content unless specifically authored to not do so, either through positional or transparency adjustments
  • Adjusting positioning to avoid the above could mean that the viewer has to look around for subtitles at times, though, since the idea is to draw attention towards the centre of the screen, this probably isn’t much of an issue
  • Semi-transparency could make text harder to read
  • People aren’t used to it, and hence it seems weird
  • May require the subtitle format to support alignment and semi-transparency settings, though the user usually has the option to specify these for simpler formats like SubRip
  • This probably isn’t applicable everywhere – I’m only considering anime fansubs here

Test Run

Well, it’s easy to do, so why not test it?  After a quick test, I found that I could actually read the subtitles without pausing the video like I usually do.  It did feel weird, but I’d imagine one could get used to it.  Here’s a before and after screenie:

Subtitle alignment comparison

Top: default styling, bottom: centre aligned with semi-transparency

The following would probably need some specific attention though, as aligning to the centre doesn’t seem the most appropriate here.  (Note that the subtitle isn’t a translation of the on-screen text; rather, it’s a translation of what’s being spoken)

Subtitle alignment comparison 2

Conclusion

I don’t know whether anyone else has thought of and tried this before – a very quick web search doesn’t turn up anything, and I’ve certainly never heard of anyone looking into this idea.

So if you read this and are interested, I certainly would love to hear your thoughts and experiences.  Personally, this idea seems worthy of consideration, and I’d like to try it out more.  Or perhaps not everyone has the same issues as I do…

How to Test it Yourself

As my interest here is anime, I’m only going to provide a rough guide on how to modify a fansubbed video for centre screen subtitles.  I’m also going to assume that it’s an MKV file with ASS subtitles (and that alignments aren’t forced on every line, etc), which is what most fansubbers distribute.

  1. First, you need to extract the subtitle stream – you can use MKVExtractGUI for this (or just the mkvextract CLI utility if your computer is anti-Microsoft; see the example commands after this list)
  2. Open the extracted ASS file with Aegisub
  3. Visit the Subtitles menu and the Styles Manager option
  4. On the right, there’s a big list box of all the defined styles, and you’ll need to identify the main one(s) (usually Default) and Edit them.  This could be difficult depending on how the styles are set up and may require you to edit multiple styles
  5. In the Style Editor dialog (that pops up after clicking the Edit button), add semi-transparency by specifying opacity in the Colors frame (it’s the textboxes under the colour buttons).  Values range from 0 (opaque) to 255 (transparent) – 96 is a good value to start with for Primary, and perhaps try 128 for Outline and 224 for Shadow
  6. In the same dialog, set the alignment to be the centre of the screen (5) and then click OK
  7. Close the Styles Manager dialog and save the file
  8. Open your video and use the “load subtitle file” option (for MPC, it’s File menu -> Load Subtitle) and select the edited subtitle file
  9. Hopefully it all works, if not, you may need to go back to Aegisub and try editing other styles
  10. Watch the video and submit your thoughts in the comment box below.  Remember that the last step is always the crucial one
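For step 1, the CLI route looks something like this (the filenames and track ID are hypothetical – mkvmerge’s listing tells you the real ID of the subtitle track):

mkvmerge -i episode.mkv
mkvextract tracks episode.mkv 2:episode.ass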

Loud Seagate Drives

Seagate doesn’t exactly have the best reputation when it comes to hard drive noise, but I’m going to tell my story nonetheless.

A while ago, a 2TB WD Green drive of mine started developing bad sectors.  I performed a disk scan (which would’ve marked the bad sectors), fixed the corrupted files and hoped that more wouldn’t develop.  However, it wasn’t long before the inevitable occurred, so I went out and bought a 4TB Seagate to replace it.  Things were fine.

More recently, my 1.5TB WD Green drive met a similar fate (it seems that all my newer drives have been failing within a few years of use).  As it so happened, Dick Smith was running a Valentine’s Day special with a 15% discount on their hard drives, so I grabbed a 4TB Seagate Expansion drive for $170 (it’s unlikely you’ll find the drive at any store here for under $195, so that was an excellent price) to replace the old WD.

Plug the drive in, and you’re greeted with quite an audible power-up, followed by whirring.  At first, I thought there was a fan inside the enclosure, considering the noise and the breather holes (more like a mesh) on the back, but I couldn’t feel any airflow, so I concluded that the whirring is actually the drive itself.  The sound is definitely quite noticeable – louder than the rest of my PC, and I can easily hear it from 3 metres away.  I have a 3TB WD Elements sitting right beside it which is almost silent – I can hear that drive when it spins up, but it’s still much quieter than this new Seagate.  Another interesting thing is that, despite my internal 4TB Seagate having the same model number as the external, the internal drive seems pretty quiet; it’s possible that the case is blocking some noise, but even with it open, I can’t distinctly hear the drive above the other noises in the case.

Now whilst I could just get used to the noise, I don’t really want to have to make that compromise.  On the other hand, I didn’t feel like going to the effort of returning the drive and then paying more for a replacement.  So I decided to try tweaking the drive’s AAM/APM settings to see what I could achieve.  Seagate conveniently doesn’t allow you to change the drive’s AAM (or they simply don’t support it, whatever), however APM is changeable.

Most search results for ‘Seagate’ with ‘APM’ seem to be people complaining about Seagate drives making audible noises when spinning down, who are looking to disable APM.  I’m a bit surprised that I can’t find anyone complaining about the normal operating noise of these drives when they’re not even being accessed.  As I’m using this only as a backup drive, I don’t mind it spinning up only when it’s actually accessed, so turning down the APM value, if it stops the whirring, could work for me.

HDParm for Windows doesn’t seem to detect USB drives (side note: interesting that they use /dev/sd[a-z] to identify drives, despite being on Windows), but I did eventually find that CrystalDiskInfo would set the APM for the drive.  Changing the default value of 128 to 16 seemed to do the trick – the drive would spin down soon after becoming idle, making the drive silent.  Success!

…except that the drive would reset its APM value whenever it lost power.  Urgh, what to do?

Turns out, the CrystalDiskInfo guys thought of this – the mysteriously worded “Auto AAM/APM Adaption” option basically makes CDI set the APM value of the drive every now and then (okay, it’s mentioned in the manual, but it’s not exactly easy to find).  This does mean that CDI has to stay running in the background, but as I have 16GB of RAM in this machine, I’m not too worried about that.

The drive does exhibit some “weird” behaviors (well, supposedly understandable, but still silly) – such as spinning up just before the PC goes into standby, then quickly spinning down.  Also, the Auto APM setting sometimes takes a while to kick in after resuming from hibernation.  As my backup routine is basically a scheduled disk sync, the drive spins up for short periods when this occurs, but it’s a tradeoff I’m willing to take.  One thing to note is that the drive seems to spin up on any activity, even reading SMART metadata; CDI, by default, polls the drive’s SMART info to check for issues, but it’s easy to disable automatic refreshing to avoid the drive whirring up every 30 minutes.

tl;dr if you’ve got a Seagate external, can’t stand the whirring, and don’t mind it spinning down when idle: install CrystalDiskInfo, turn down the APM, enable the auto APM setting, get CDI to load on startup and disable automatic polling of SMART info on the drive.
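For the Linux-inclined (or anyone whose hdparm can actually see the drive), the equivalent APM change is a one-liner – the device name here is hypothetical, and values of 127 or below permit spin-down:

hdparm -B 16 /dev/sdb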


Side note: CrystalDiskInfo provides a “Shizuku Edition” of the application.  As I couldn’t find information on how it differs from the Standard Edition, I ended up downloading both, curious about the ~65MB size difference.  Turns out, Shizuku is just an anthropomorphised mascot for the application, the size difference being mostly high resolution PNGs depicting her in the theme that comes packed with the Shizuku version (the ‘Ultimate’ version contains multiple copies at multiple resolutions – presumably ‘Simple’ and ‘Full’ don’t contain the higher resolution copies, although one wonders whether multiple copies were really necessary).  The devs even went to the effort of getting the character voiced, which means you get a cutesy voice warning you if your hard drive is about to die (assuming you know enough Japanese to take a stab at what’s being said).
Though despite my enjoyment of moe anime, I’m fine with the Standard Edition.  Hats off for the effort, nevertheless.

Side-side note: the voices mentioned above were encoded using Opus, my favorite audio codec as someone interested in data compression.  Yay for not going MP3.  Now if only they could get those images down to a reasonable size…

Blog Revival!

For the zero or so people who like visiting zingaburga.com: as you can see, my old blog has been put back into action.

I stopped updating this for a while, and when the old server died and I realised that I was too stupid to have kept a backup copy of my server configuration, I didn’t bother trying to put this blog back up.

So now that I have, are more updates coming?  Maybe – depends on what I feel like, really.  Since I’ve just done yet another server move, I decided to set this up again, as I did recently have some thoughts on writing articles to no-one in particular.  (On the topic of servers, cheap $15/year VPSes are really decent these days!)

It’s funny reading my old posts though – they bring back some old memories, but make me feel a little silly at times.

Moved Server

Helloo~ another rare post from me!

I recently shifted my websites from my old dedicated server to this VPS – a move I’d been too lazy to make for about 2 years.

The dedicated server was rather overkill for the website I’m running (I originally had other plans, but didn’t follow them through), so I’d been paying too much for hosting for quite a while.

This new VPS is a Xen box with 1GB RAM, 30GB HDD and 1.5TB/mo bandwidth from ChicagoVPS, using the awesome deal here.  I asked support to increase the space to 50GB, which they did for only $1.75/mo extra (awesomesauce).  They also agreed to supply a further prepayment discount if I switch to an annual billing cycle, which I plan to do soon.  I’ve been happy with the speeds and I/O performance; the CPU is a Xeon X3450 (the Xeon equivalent of an i7 920), so that’s pretty cool too.

Now the fun part: setting the thing up.  The old server ran CentOS 5 64-bit, but after using Debian I somewhat prefer its setup, so I decided on Debian 6 32-bit for this server.  Server stack software:

Webserver
Running an nginx frontend proxying to an Apache backend with the PHP module.  I’ve historically had issues with CGI/FastCGI, which is why I decided to go with the more familiar Apache PHP module, although the last time I tried FastCGI was years ago.  nginx has been great though, and it allows me to run a minimalist Apache, which works well for me.  I also get the advantage of accelerated proxy responses in XThreads, although I’ve removed all the big downloads I used to have, to fit in the 50GB disk space.

Unfortunately, unlike my other installs of Apache with the PHP module, Apache seemed to be leaking memory on this setup.  I tweaked a few PHP configuration variables and the leak seems to have magically gone away, though I don’t know why.  Nevertheless, I decided on a higher *SpareChildren configuration and a very low MaxRequestsPerChild to work around any possible memory leaks.  Apache itself only has 3 modules active (some configuration needed to be modified to accommodate this minimalist setup): mod_dir, mod_rewrite and mod_php5

I’ve also gotten nginx to send HTTP Expires headers, so pages will load faster (since Firefox won’t be sending conditional requests and waiting for HTTP 304 responses for static files).  Otherwise, configuring two servers is a bit more of a hassle, especially with rewrite rules, but it’s manageable.
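For the curious, here’s a minimal sketch of the above setup – the server name, backend port and paths are assumptions for illustration, not my actual config:

server {
    listen 80;
    server_name example.com;

    # serve static files directly and attach Expires headers
    location ~* \.(css|js|png|jpg|gif|ico)$ {
        root /var/www/example;
        expires 30d;
    }

    # everything else is proxied to the minimalist Apache backend
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}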

Database Server
I’ve decided to go with MariaDB instead of MySQL here.  As with MySQL, the MariaDB defaults are a bit overkill for a 1GB server, so my.cnf needs tweaking.  Unfortunately, whilst there are many MySQL tuning articles out there, I didn’t find any for MariaDB – and although MySQL advice largely translates over, there are parts which don’t.  So the configuration took a bit more time and effort to get right.

Whilst disabling InnoDB and tweaking buffers is probably enough for a standard MySQL setup which only runs MyISAM tables, MariaDB includes, and activates by default, a number of other plugins which probably need to be disabled (such as PBXT).  Aria, being the new internally used storage engine, cannot be disabled, so you need to remember to tweak down its default buffer size in addition to the MyISAM buffers.
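To give an idea, the relevant my.cnf lines look something like this (option names are from the MariaDB 5.x era; the values are illustrative rather than tuned recommendations):

[mysqld]
skip-innodb                          # no InnoDB tables on this box
key_buffer_size = 32M                # MyISAM index buffer
aria_pagecache_buffer_size = 32M     # Aria can't be disabled, so turn its buffer down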

Speaking of Aria, I decided to switch all my tables to Aria format as it’s essentially an improved version of MyISAM anyway.  Everything seems smooth sailing so far.

As for database backups, I’ve decided to move away from the mysqldump command I’ve been using for so long.  Although I’d disabled table locking when dumping, so that the website didn’t lock up for 15 minutes during the dump, I’m not sure how appropriate that really is, not to mention that it seems like a lot of unnecessary load.  Considering alternatives, there seem to be only two: mysqlhotcopy, or a replicated slave which I can run mysqldump on.  The latter requires more configuration, so I’m considering the former.  However, mysqlhotcopy seems to lock all tables being dumped, which means the site locks up for about 30 seconds whilst the database gets copied.  I’m not really worried about the downtime, but the fact that requests queue up on the server and quickly chew through RAM is something I do have to take into consideration.  As the mybb_posts table will obviously be the one taking the longest, and locking that table will only really affect new posts, it seems better to lock and copy individual tables, which will probably mean writing my own script (or calling mysqlhotcopy a few times; see the sketch below).  There’s a slight possibility of data desynchronisation between tables, without referential integrity, but I’d presume this is somewhat rare.  Besides, if this really is an issue, it’s possible to group commonly used tables together.
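As a sketch, the lock-one-table-at-a-time idea could be as simple as the following (database name and backup path are hypothetical; each per-table mysqldump locks only that table, and mysqlhotcopy could be substituted for a raw file copy):

#!/bin/sh
# dump tables one at a time so only the table currently being copied is locked
for t in $(mysql -N -B -e 'SHOW TABLES' mybb); do
    mysqldump mybb "$t" > "/backup/db/$t.sql"
done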

Other
Well, the webserver/PHP and database server are the most exciting parts to configure, since they’re the heart of the website-server (trying not to say “webserver” again).  I went with postfix instead of sendmail, and the email configuration wasn’t as scary as I thought it would be.  Nothing else is particularly worth mentioning otherwise…

Moving the server
I had originally planned to stagger the move: I first moved zingaburga.com over, so I could identify any issues (such as the Apache memory leak).  After that, moving everything else over went pretty quickly, even the EP database (well, I did move all the attachments over before closing down the forums; that, combined with setting the domain’s TTL to 60 seconds, meant there wasn’t that much downtime).

Unfortunately, the EP tables were defaulting to latin1 encoding.  This caused an issue, as UTF-8 data was stored in them and the default encoding on this new server is UTF-8.  That meant hours of downtime, with me staying up into the wee hours of the night repairing the encoding.  And then, after I did that, I forgot to switch the users table back to textual formats (from binary fields), so no-one could actually log in.  Some other bugs which I didn’t have before needed some nginx proxy tweaking, but otherwise everything seems to be well.

Overall, the server never seems to go over 500MB of RAM usage in normal situations, so I’m glad I got 1GB for plenty of headroom.  I’m also surprised at this relatively low memory usage, despite being rather generous with buffer sizes, but I guess tweaking pays off.

Too Much Bandwidth (or maybe, just quota)

So, time for another pointless update on myself (well, I may as well post, otherwise this place would be entirely dead).

I’ve posted a number of times before about my internet connection, and you’ve probably figured that I’ll never shut up about it until something like the NBN comes (if it ever does).  But anyway, this might be a bit of a turn.

Right now, I’m on a 1.5Mbps connection with 25GB peak and 120GB off-peak (2am – 12pm) quota per month (if you’re wondering, the annoying slowdowns have since mysteriously vanished).  Exetel (my ISP) have decided to increase prices by $10/month, so their lowest (non-shit) plan is now $50/month.  They have somewhat “compensated” by increasing quotas to 30GB + 180GB off-peak (which will become 2am – 2pm); however, I’m already finding it really difficult to use up my current quota.

I’ve looked around, but for 1.5Mbps connections, it seems there really isn’t much cheaper available (thanks to Telstra’s dominance in the area) – probably the most I could save would be $5/month which would also require bundling with a phone.  Oh well.

So, back to the issue of using up the quota.  I guess I don’t really have to, but I’ve developed this idea that I should, and despite telling myself it’s unnecessary, I’m always trying to find something to exhaust the bandwidth on.  So yeah… downloading useless stuff.  It’s especially difficult for me, as I try to be conservative with bandwidth usage.  I’m really starting to run out of ideas over what to do with the quota – perhaps I should convince myself not to bother with it (and save some electricity by not having the computer on at 2am downloading stuff).

Streaming POST data through PHP cURL Using CURLOPT_READFUNCTION

Well, I haven’t posted here in quite some time… I’m not dead, and don’t plan on completely ditching this blog, but well…

Anyway, onto the article.

I had a PHP application where I wanted to upload part of a large file to some other server.  The naive method would be to simply split the file and upload the piece through cURL; however, I wanted to do this without any splitting.  So I needed a way to send a POST request whilst building the request body on the fly (note: you’ll need to know the total size in advance to be able to send the Content-Length header).

The obvious decision would be to use sockets rather than cURL, but I felt like seeing if it was possible with cURL anyway.  Although I’ll still probably use sockets (because it’s easier in the end), I thought this might (well, not really) be useful to one of the three readers I get every month.

Anyway, if you look at the curl_setopt documentation, you’ll see a CURLOPT_READFUNCTION constant; however, how to actually use it doesn’t seem clear (especially with the boundaries for the multipart/form-data encoding type).  Also, the documentation is wrong.

Without further ado, here’s some sample code:

<?php

$boundary = '---------------------------168279961491';
// our request body; note the '--' prefix on each boundary delimiter and the
// extra trailing '--' on the final one (as multipart/form-data requires)
$str = "--$boundary\r\nContent-Disposition: form-data; name=\"how_do_i_turn_you\"\r\n\r\non\r\n--$boundary--\r\n";

// set up cURL
$ch=curl_init('http://example.com/');
curl_setopt_array($ch, array(
 CURLOPT_HEADER => false,
 CURLOPT_RETURNTRANSFER => true,
 CURLOPT_POST => true,
 CURLOPT_HTTPHEADER => array( // we need to send these two headers
 'Content-Type: multipart/form-data; boundary='.$boundary,
 'Content-Length: '.strlen($str)
 ),
 // note, do not set the CURLOPT_POSTFIELDS setting
 CURLOPT_READFUNCTION => 'myfunc'
));

// function to stream data
// $fp is the stream set via CURLOPT_INFILE (unused here), $ch is the cURL
// resource handle, and $len is the maximum number of bytes to return
function myfunc($ch, $fp, $len) {
 static $pos=0; // keep track of position
 global $str;
 // grab the next chunk of data
 $data = substr($str, $pos, $len);
 if ($data === false) $data = ''; // past the end: return '' to signal end of body
 // increment $pos
 $pos += strlen($data);
 // return the data to send in the request
 return $data;
}

// execute request, and show output for lolz
echo curl_exec($ch);
curl_close($ch);

Hopefully the comments give you enough idea how it all works.
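Since my actual aim was to send part of a large file, here’s a sketch of the same callback reading a slice of a file instead of a string – the path, offset and length are hypothetical, and the multipart boundary lines are omitted for brevity (you’d still need to weave them in and count them in Content-Length).  This would replace the CURLOPT_READFUNCTION setting and myfunc above (the closure needs PHP 5.3+):

<?php
$path = '/path/to/large.bin'; // hypothetical file
$offset = 1048576;  // where the slice starts
$length = 4194304;  // bytes to send - also needed for the Content-Length header

$fh = fopen($path, 'rb');
fseek($fh, $offset);
$remaining = $length;

curl_setopt($ch, CURLOPT_READFUNCTION, function($ch, $fp, $len) use ($fh, &$remaining) {
 if ($remaining <= 0) return ''; // empty string signals end of body
 $data = fread($fh, min($len, $remaining));
 $remaining -= strlen($data);
 return $data;
});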

PMPs – Why do People Ignore Compression?

One thing I notice with many portable devices is that companies sell higher capacity versions for exorbitant premiums, when flash memory really isn’t that expensive.  It seems to be less of an issue for players which include a (mini/micro)SDHC expansion slot, as you can effectively increase capacity with a cheap add-on card.

But despite this, it seems that many people really do pay these excessive premiums for the increased storage.  I do sometimes wonder how people fill up so much space, e.g. when getting a 32GB player over a 16GB one.  Presumably these people have lots of videos and music, probably more than they need, and obviously a higher capacity player allows them to carry more on the same device.

Whilst this is fine for the majority who aren’t so technically inclined, I do wonder about the people who are more technically inclined, and why they overlook the other side of the equation.  For example:

Amount of music that can be stored = Storage capacity ÷ per song size

Now we want to be able to store more music (again, even if it’s a lot more than we need), but the general approach of simply upping storage capacity addresses only one side of the equation – most people, even the more technically inclined, seem to ignore the fact that you can also store more by reducing the file sizes of your media!

Admittedly, compressing stuff can take effort.  In fact, I’ve had a number of motivations that most probably never had, including the old days of me trying to fit MP3s on floppies, squish as much as I could out of my 4GB harddrive, squeeze music on a 256MB MP3 player, and packing videos onto my 1GB PSP memory stick.  However, with a bit of reading, it’s mostly sticking your music/videos into a batch converter and then copying everything across.  It’s slightly less convenient when you add stuff (you probably need to pass these through a converter too), though, personally, I’m used to doing this, so I don’t mind.

But does compression really yield much benefit?  From what I’ve seen, I’d say so.  It seems most people just dump their 128/192/256/320kbps MP3s (usually 320kbps, as this is a popular size on P2P) on the device, and that’s all they care about.  From the fact that most people cannot hear defects in 128kbps MP3s (let’s assume LAME encoded), and from my own listening tests, I’d say that most people cannot hear defects in 56-64kbps HE-AAC (encoded with NeroAAC).  Support for this format is limited though (SBR is difficult to implement on embedded devices), but I believe Rockbox supports it, along with the latest iDevices (pre-late-2009 models do not support HE-AAC).  Next in line would be 80-96kbps Ogg Vorbis, if your player supports it.  In fact, I personally cannot hear defects in 128kbps Vorbis, so even audiophiles could get a big space saving by using higher bitrate Vorbis.  But support for Vorbis is surprisingly low, considering that it’s a royalty free codec.

For an audio format with a fair bit of support, there’s LC-AAC (aka “AAC”), which achieves similar quality to 128kbps MP3 at around 96-112kbps (using NeroAAC or iTunes).  Failing that, using LAME to encode MP3s with a variable bitrate can yield decent quality at average bitrates around 112kbps.

Now if we assume that the average song is a 320kbps MP3 and the listener really can’t hear defects in 128kbps MP3s, and the underlying player supports HE-AAC, we could get a massive 320-56 = 264kbps saving (82.5% smaller!) by being a bit smarter in storing our music.  This equates to being able to store over 5 times more music in the same amount of space.  But of course, this is an optimal situation, and may not always work.  Even if we’re more conservative, and say that the average MP3 is 192kbps, and the underlying player only supports LC-AAC, we can still get a 50% reduction in size by converting the 192kbps MP3 to 96kbps LC-AAC, which equates to a doubling in storage space.

Videos are perhaps more difficult to get right, as the parameters involved in video encoding are significantly more complex than in audio encoding (also note that videos often include audio).  From what I’ve seen, significant space savings can be gained by encoding videos more intelligently, but it’s hard to provide rough figures, as most people do convert videos for their portable devices but use a wide variety of applications and settings.  For reference, I see a lot of >100MB PSP encoded anime episodes; however, I can personally get them to around 30-40MB using an x264 CRF of 25 and a ~8MB audio stream (allowing me to easily store a 12 episode anime series on a 1GB stick, with plenty of space to spare).
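For the curious, a present-day sketch of such an encode with ffmpeg (the filenames, the PSP-friendly resolution/profile and the audio bitrate are assumptions in line with the figures above, not my exact settings):

ffmpeg -i episode.mkv -c:v libx264 -crf 25 -profile:v main -level 3.0 \
       -vf scale=480:272 -c:a aac -b:a 48k episode.mp4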

So for those who don’t compress their media, maybe give it a bit of a shot and see what space savings you can get.  You may be surprised at how much 16GB can really store.


Why would anyone buy an iMac?

People who know me probably know that I’m a lot more anti-Apple than I am anti-Microsoft, but that’s beside the point here.

I was browsing some ads that got sent to my house today, saw one for an iMac (as Apple tightly controls prices, I’d expect them to be similar across stores) and was seriously quite shocked at what was on offer.  The cheapest system had:

Intel i3 3GHz CPU
4GB RAM (probably DDR3)
500GB Harddisk
256MB ATI Radeon HD 4670 GPU
21.5in screen
MacOSX 10.6

All for AU$1598!  To put this in perspective, my current computer, which I bought in 2008 when the AUD crashed, cost me less and is still more powerful than the above.  This is what I paid:

Intel Core2Quad Q6600 [$295] (FYI: a C2D E8500 was about $285 at the time – comparison with i3)
4GB DDR2 1066MHz Kingmax RAM [$95]
640GB Samsung F1 7200rpm HDD [$89]
512MB ATI RadeonHD 4670 GPU [$119]
Gigabyte EP45-DS4P motherboard [$199] (that’s a rather expensive motherboard BTW)
Antec NSK6580 case with 430W Earthwatts PSU [$128]
Logitech Desktop 350 (basic kb+mouse) [$22]

…which totals $947.  If we add in a 21.5in screen [probably under $200 at the time] and a DVD burner [around $30 at the time], and even add a copy of Windows (around $200), it’s still significantly cheaper than the iMac today – even disregarding the fact that the AUD was worth 60% of what it’s worth today, relative to the USD.  Oh, and yes, my system pretty much beats the iMac in every way, not to mention it’s far more customisable and not as locked down as anything Apple makes.

Okay, Apple’s stuff is absurdly expensive – this is probably nothing new.  From what I’ve heard, people may buy Apple stuff for its design.  But is the design really any good?  I personally don’t think so.

Our Uni recently replaced all library computers with iMacs (a different model to the one advertised, so I may be a little misinformed here) and I really don’t like their design in a number of ways.  After using one for a while, these are my thoughts so far:

The Screen and Machine

  • It’s big, heavy and somewhat cumbersome.  It appears you can only tilt the screen forward and backwards.  Although most screens (especially cheaper ones) don’t seem to be terribly adjustable, I much prefer the Dells in the IT labs, where you can adjust the height, swivel horizontally and rotate the screen itself on the stand.
  • It’s glossy.  I don’t know why the hell people make glossy screens.  If I wanted to see my own face, I’d look in a mirror.  If I wanted to see that bright light behind me, which is reflecting off this stupid glossy screen, I’d look directly at it (but I wouldn’t – I’m not that stupid).  When I’m looking at a screen, I want to see what’s actually on there.
  • I can’t seem to find any controls on the screen.  Maybe there are some on the back, but I didn’t look too hard.  Not that screen controls should be on the back anyway.
  • USB ports.  The last computer I used which didn’t have USB ports at the front was made about 10 years ago.  Apple helps you bring back those memories by not putting USB ports at the front (or sides).  As for the back USB ports, the number of them is somewhat limited…
    I did actually later realise that there were USB ports on the side of the keyboard.  I guess that’s a reasonable way to do things, though I still would be concerned whether these ports supply enough power for a portable HDD.
  • Actually, make that: there’s nothing useful on the front or sides of the screen.  The power button is conveniently located at the back of the screen, so if you want to turn it on, you’re going to have to pull the screen forward, then turn it around so you can reach the button (making sure you don’t pull out any cords), then do the reverse to return the screen to its original position.
  • The back doesn’t appear to have that many ports either, though I didn’t check much (it’s not easy to), and it’s certainly a lot fewer than what my Gigabyte EP45-DS4P motherboard supplies.
  • I still haven’t managed to find where the optical drive is…

The Keyboard

  • Is small and flat – very much like a laptop keyboard.  Maybe some people prefer laptop keyboards, but I don’t.
  • Has very few extra keys.  Fair enough I guess, but overall it seems like a cheapish keyboard and hardly anything I’d pay a premium for.  Quite usable though.
  • Doesn’t have a Windows key, for all those planning to install Windows on it (the Uni library iMacs all run Windows).  Fair enough from an Apple standpoint I guess.

The Mouse

  • The trackball is quite small.  At first I didn’t like it, but after a while of using it, it seems okay.  In fact, it being a ball allows you to horizontally scroll quite nicely, despite many applications not supporting horizontal scrolling, but I guess that’s not the mouse’s fault.
  • One-button design.  Despite its looks, the mouse can actually distinguish left, centre (the ball) and right button clicks reasonably well, but only if you push your fingers in the right place.  Unfortunately, as this is a single button design, there isn’t really any clear way to feel where the right place is without looking, apart from finding the ball with your fingers and judging the left and right portions from there.  If you press too close to the centre, you can inadvertently trigger the wrong button.
  • Following on from the above, you cannot click the left and right mouse buttons at the same time.  Not important for most applications perhaps, though I know some games require both buttons to be pressed at the same time (or are enhanced by the ability).
  • Like the keyboard, the mouse is fairly basic and has no extra side buttons and the like.  Hardly anything I’d pay a premium for.

So there’s my take on the iMac: seriously overpriced and badly designed.  Unless you absolutely must use OSX (and are unwilling to build a Hackintosh) or are just an avid Apple fanboi, I can’t see why anyone would rationally buy this hunk of junk.

New USB Stick

I’ve had a number of USB sticks in the past, and historically they’ve tended to last around 2 years for me.  My current (well, actually, previous now) stick is an 8GB Transcend, and I’ve already been using it for over 2.5 years, so I’ve been wondering if this thing is going to die.  Maybe it’s better built, maybe it’s just luck, but I decided to eliminate that risk factor and get myself a new USB stick just in case (yes, I do manually back up data, but backups are only so good).

Anyway, one of the things bothering me with the Transcend stick is its horrible speed.  Running portable apps like Firefox Portable takes forever, and saving anything on the USB has a noticeable lag.  As USB sticks are really cheap these days, I decided to look for a faster stick, rather than a larger one.  I’m only using around 300-500MB anyway, and rarely go above 700MB unless I’m in the rare situation of transferring some large files (in which case, I don’t mind bringing my USB HDD along), so I could easily live with a 2GB stick, perhaps 4GB for good measure.

Unfortunately, it seems all the faster USB drives are also large.  Looking around, the best that appealed to me were the 8GB Corsair Voyager and Patriot XT Xporter Boost from Umart (which now sell for around $25).  Drives like the OCZ Throttle and Corsair Voyager GT I could only find in at least 16GB sizes, which cost significantly more, and I seriously don’t need all that space.

Then I saw that MSY were selling the 8GB Patriot Xporter Rage for $25, so I decided to get one of those.  After some Googling though, I was a little worried about whether it delivered its advertised speed, having found a thread where users were complaining about the 16GB version’s write speeds, also hinting that the larger drives (64GB) may actually deliver the advertised speeds (and I was getting the smaller 8GB one).  But anyway, I went ahead and bought it (after they managed to get one in stock) for $24 (yay, $1 saving!).

Bringing it home, it came formatted as FAT32 with 64KB clusters by default.  I do seem to get around 25MB/sec on sequential writes (woot!).  64KB clusters are a bit excessive, but as I don’t really care about space, I don’t mind.

As for the physical drive itself, it’s slightly smaller than the Transcend, and I actually like its capless design.  On my old stick, there’s a little slider at the side which you push forward to extend the USB connector.  On this one, you push the entire back part of the casing forward to reveal the connector.  One problem with capless designs is that applying pressure to the USB connector can cause it to retract (a pain if it gets loose and you don’t quite fit the connector in properly), but with the new Patriot drive, you’re naturally going to be applying pressure from the back of the stick, so it doesn’t really matter.  The outside is also slightly rubbery, though I don’t think the additional grip is of much importance.  The thing I don’t like is that it no longer has an activity indicator LED.

So, now that I have an 8GB stick, what to fill it up with?  As this is supposedly a fast drive, I decided to stick some bootable stuff on it, just in case I ever need it (unlikely, but oh well).  I’m too lazy to read up on making Linux boot drives, so I just used this and added some stuff that might come in handy – UBCD, System RescueCD and Ubuntu 10.10 (Knoppix and Bart’s PE might’ve been nice; it would also be nice to have a quick-booting, text-based Linux distro which runs a shell script at bootup – might be useful for quickly performing some offline actions on a PC).

Unfortunately, the formatting process also reverted the drive’s cluster size to 4KB, but it seems that Acronis Disk Director, which I happened to have installed, is able to convert cluster sizes, so I upped it to 64KB.  The first time I tried, it didn’t work (maybe because I didn’t reboot the PC as it asked me to).  Out of interest, I noticed that Disk Director allows creating multiple filesystems on a USB stick (Windows disk management doesn’t allow this); however, it seems that Windows just ignores the other filesystems on the drive…  Anyway, I reformatted and recreated the drive a second time, upping the cluster size to 64KB, and it worked.  Except that I got some warnings in the bootloader about the cluster size being > 32KB.  Despite everything working, I decided to just convert the thing down to 32KB for good measure anyway.

So that’s the wondrous story of my new USB stick, on which Firefox Portable doesn’t take forever to load.  Maybe it’ll mean that I take up more space, since I used to stick everything in self-extracting EXEs on my old drive (they would extract stuff to the C: drive and run from there, as sequential reads on the USB were reasonable, as opposed to random reads).

Oh, and I’m also running a git repo on there too, with SmartGit as my portable Git client (tip: you don’t need the full MSYS Git for it to work – just git.exe and libiconv.dll seem to be enough).