
The fall of Oyster and Scribd: Subscriptions might become interesting again

Interesting Tweet-storm (or at least that is what I think it is called) from Fahrenheit Press, a fine purveyor of crime fiction.

I encourage you to read the rant, as it’s a good indicator of the state of subscription services for eBooks.

Below is a good summary of what happened.

No publisher is going to turn down terms like that. So when a VC-backed entity like Oyster or Scribd says “okay, you win, we’ll starve ourselves,” all the oxygen leaves the room for non-terrible discussions.1 It effectively set back reasonable business terms on subscriptions by two years.

Now that Scribd is circling the drain and Oyster is effectively gone, things get interesting again for the rest of the players to see if we can move this industry in the right direction, with business models that benefit the entire food chain.

As for the staff of Oyster, congratulations on being “acqui-hired” by Google.  There are definitely worse fates than that.


  1. By “non-terrible”, I mean business terms where the distributor doesn’t lose their shirt in the transaction. 

Ad-Blocking: users are revolting

There has been a lot of great writing across the Web regarding advertising and the ethics of ad-blocking software since the recent release of iOS 9 and its new Safari Content Blocking features.  Unsurprisingly, ad-blocking software quickly rose to the top of the paid-app charts on iOS.
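For context, Safari’s new mechanism is declarative: instead of running extension code on every page, an app ships Safari a JSON list of rules, each pairing a trigger (a URL pattern) with an action such as “block”. Here is a minimal sketch of such a rule list, generated with Python; the domains are hypothetical, purely for illustration.

```python
import json

# Two hypothetical rules in Safari's content-blocker format:
# each pairs a "trigger" (URL pattern) with an "action" such as "block".
rules = [
    {
        "trigger": {"url-filter": "ads\\.example\\.com"},
        "action": {"type": "block"},
    },
    {
        "trigger": {"url-filter": ".*", "if-domain": ["tracker.example.net"]},
        "action": {"type": "block"},
    },
]

# The app extension hands this JSON to Safari, which compiles the rules
# into an efficient matcher; no extension code runs per page load.
blocker_json = json.dumps(rules, indent=2)
print(blocker_json)
```

That declarative design is why these blockers are fast and battery-friendly compared to the older style of desktop ad-blocking extensions.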

Content providers quickly responded:

[Image: CNET blocking mobile Safari]

Is this what we really want?

I’ve been running ad-blocking software for years.  It’s one of the first things I set up when I download Chrome or Firefox.  I do this as a public service to other people’s computers too.

This has been coming for years.  Advertisers had been lulled into complacency by the advent of mobile browsers.  If mobile Safari or Chrome for Android had included these features from the very start, we’d probably have seen the emergence of native ads sooner, and Apple’s own ad platform probably would have taken off.  That it has taken Apple this long to realize this is their own fault.

I know that this is to the detriment of content creators.1  Working in the digital publishing space at Kobo, I’ve seen how consumers don’t appreciate that digital production and delivery represent only a small part of the production life-cycle for books and journalism.  Just because the consumption channel is digital doesn’t mean that there are massive savings for the content producer.2

Ben Thompson notes:

This didn’t happen by accident; to BuzzFeed founder and CEO Jonah Peretti’s credit, BuzzFeed was built from day one to be a business that earned money the old-fashioned way: by being better at what they do than any of their competitors.

Publications that seek to imitate their success — and their growth — need to do so not simply by making listicles or by focusing on social. Fundamentally, like BuzzFeed, they need to start with their business model: the future of journalism depends on embracing what far too many journalists are proud to ignore.

And Seth Godin absolutely nails the current consumer ambivalence to how hostile ads have become3:

And advertisers have had fifteen years to show self restraint. They’ve had the chance to not secretly track people, set cookies for their own benefit, insert popunders and popovers and poparounds, and mostly, deliver us ads we actually want to see.

Alas, it was probably too much to ask. And so, in the face of a relentless race to the bottom, users are taking control, using a sledgehammer to block them all. It’s not easy to develop a white list, not easy to create an ad blocker that is smart enough to merely block the selfish and annoying ads. And so, just as the default for some advertisers is, “if it’s not against the law and it’s cheap, do it,” the new generation of ad blockers is starting from the place of, “delete all.”

This problem will only get worse for content publishers that rely on today’s advertising platforms to generate revenue.  Keep in mind that the companies advertising their wares aren’t going to hurt; they only pay for ads that are seen, not ads that are blocked.  It’s the companies with large active user bases who are willing to monetize them that are going to win out.  The Facebooks, YouTubes and Snapchats are going to see a pretty big increase in advertising spend from companies–from a limited pool, no less.4  The kicker is that they don’t even produce any of the content themselves.5

On the content publishing side, we’ll see a lot more consolidation as publishers-of-all-sorts begin to realize that their captive audience is too small to fund their production line.  Book publishers will also need to re-evaluate the entire creative life-cycle; everything from how agents work with up-and-coming authors, to how content is produced, marketed and sold to consumers.  That’s a scary proposition for an industry that has remained remarkably unchanged over the last century, but for those leading the charge, it’s quite exciting.  I’ve been involved in digital publishing for less than a decade and I’m still amazed by it.

 


  1. This doesn’t keep me up at night. 
  2. I’ll be the first to admit that there is probably some fleecing going on at the big 5.  They over-value their back-catalogue in an age where everything is about the “now”.  There is also some disruption occurring in this space that indicates what the market is willing to bear (for better or for worse) with regards to the pricing of eBooks, most notably in digital-only, self-publishing platforms like Wattpad, Kobo Writing Life and KDP. 
  3. If there is one problem that I am trying to solve at Kobo it is this quote from Godin: “Commodity products can’t expect to easily build a profitable ‘brand’ with nothing but repetitive jingles and noise.” 
  4. Who knew that the total US spend for advertising has held steady at ~1.29% GDP? Source.  All retailers have done is shift from one channel to another. 
  5. Within the next two years, Facebook will begin producing their own content.  They will probably acquire BuzzFeed or something. 

Product management fundamentals: The next feature fallacy

Joshua Porter writes:

When your product is growing and ramping up new customers, it’s easier to focus on new compelling features that increase engagement.  It’s also easier to ignore dissatisfaction with the increasing base of existing customers because your growth rate exceeds your churn.

Things start to fall apart, though, when your growth slows down.  It’s tempting to focus on new, exciting features that you think will turn back the tide, and you fall into the trap mentioned above.  It makes sense in hindsight: you and your team are used to the pace and cadence that comes with new feature development. The problem is that the reach of each feature becomes smaller over time. Features that assume a specific level of engagement will, more often than not, fall flat because discovery of the feature will never be 100%.  If you’re lucky, it will reduce churn.  It will not increase growth.
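To make the reach problem concrete, here is a back-of-the-envelope sketch. All the funnel numbers are illustrative assumptions of mine, not data from Porter or Chen; the point is only the shape of the arithmetic.

```python
# Hypothetical user lifecycle funnel for a product (illustrative numbers only).
visitors = 100_000                   # people who land on the product this month
signups = int(visitors * 0.20)      # 20% create an account
regulars = int(signups * 0.30)      # 30% of signups become regular users
power_users = int(regulars * 0.10)  # 10% of regulars are heavily engaged

# Even a well-promoted feature is only discovered by a fraction of its audience.
discovery_rate = 0.40

# A feature that assumes deep engagement vs. one aimed at the top of the funnel:
power_feature_reach = int(power_users * discovery_rate)
onboarding_feature_reach = int(signups * discovery_rate)

print(power_feature_reach)       # 240 people
print(onboarding_feature_reach)  # 8000 people
```

With these assumed rates, the “exciting” power-user feature can touch at most a few hundred people, which is exactly why it can’t move the growth curve no matter how good it is.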

Lifting up the covers and opening up the closet often reveals things like dust bunnies and skeletons.  No one, and I mean no one, likes to work with that stuff, but it’s a necessary part of building a great product or service.

Andrew Chen wrote a great response to the tweet entitled “The Next Feature Fallacy: The fallacy that the next new feature will suddenly make people use your product”.  It’s a great read and I especially like this quote:

How to pick the next feature
Picking the features that bend the curve requires a strong understanding of your user lifecycle.

First and foremost is maximizing the reach of your feature, so it impacts the most people. It’s a good rule of thumb that the best features often focus mostly on non-users and casual users, with the reason that there’s simply many more of them. A small increase in the front of the tragic curve can ripple down benefits to the rest of it. This means the landing page, onboarding sequence, and the initial out-of-box product experience are critical, and usually don’t get enough attention.


Concentrate on the things that matter.  Fix the stuff affecting the majority of your customers today.  Get your analytics up and running so that you understand your customer life cycle.  Most importantly, make sure that everything you do continues to drive towards the vision you have for your product (and it’s okay to change and pivot if you really have to).

-T

Hackintosh thoughts

File this under the First-World-Problems Dept.

I have owned and used Apple computers since 1996. Here is the list:

  1. 1996: The first was shared with my brother, an Apple Performa 6400.
  2. 2002: iBook G3 600 MHz
  3. 2007:  15-inch MacBook Pro, Core 2 Duo (Santa Rosa)
  4. 2009: Late-2008, 15-inch MacBook Pro, Unibody

I’m generally happy with my experience, although it hasn’t been a smooth ride…the iBook G3 had a smelly keyboard and a DVD drive that wouldn’t stay closed.  My Santa Rosa MacBook Pro needed a power-inverter replacement and a fan replacement, and its FireWire port didn’t work–it was a lemon that Apple graciously replaced with a late-2008 MacBook Pro, whose network port failed the first time I plugged it in at Kobo.

Aside: I think it was the network there…from what I know, at least two other late-2008 MacBook Pros were affected.

I still use my MacBook Pro (with upgraded 8GB ram, 256 GB SSD with a 2nd hard drive in the original combo drive slot). I’m amazed that I’ve been able to keep it running this long.

This doesn’t include the slew of work computers that I have had (MacBook Pros, MacBook Airs, etc.), two of which exhibited overheating.  But I digress–this wasn’t supposed to be a post about my poor experience with Apple hardware.  I love the stuff.  Nothing except Lenovo ThinkPads comes close to the build quality that Apple puts out (but the ThinkPads are butt ugly).

The reason I am writing this is that I have an itch to build a new computer again.  In mid-2013, I built an ESXi whitebox to experiment with hardware virtualization. I recently pulled that box out of the basement and handed it to my brother because I wasn’t using it.  Lately, when I look at my Hackintosh, I often think of my long history with Apple hardware and software and my underlying motivation to build Macs rather than just buy a real one.

In 2009, I convinced Jen that I could build a Mac myself using some Hackintosh guides.  I built a nice quad-core Q9550 machine.  Three years later, I upgraded my Hackintosh to a build based on an i5-3570K.  I still use it today in my office as my photo workstation.

Running a Hackintosh is not without its faults.  My video card will freeze and lock up the computer.1  I’ve never bothered to get sleep working (although I know it can work).

It’s more cost effective than buying an iMac if you already have a good monitor, keyboard, etc, but generally more of a pain in the ass to maintain.

After briefly flirting with ESXi on an AMD FX-8350 build, I’m itching to build another Hackintosh.  The biggest change in the “scene” is the emergence of the Clover EFI bootloader. Other than that, I see the same issues that I’ve dealt with for the past six years:

  • Sound doesn’t work (get a USB sound card…)
  • It won’t boot (check your hardware configuration, boot flags, .kext files)
  • Power management doesn’t work
  • System updates borked the install
  • FaceTime and iMessage don’t work

All of these are easy to troubleshoot, and much easier if you use a vanilla-based install from a legitimate Mac.

I don’t think cost is much of a driver anymore in the Hackintosh scene.  Six years ago, Mac hardware came at a significant premium, but the gap has mostly narrowed.  It really comes down to the folks who want a Mac that is more powerful than the Mac Mini, but not tied to a built-in monitor like the iMac’s.  Count me as one of those users.

However, it’s 2015 now and even the top-of-the-line Retina iMac is only ~13% faster than the comparable Retina MacBook Pro.  That’s barely above the threshold of noticeability.  In some cases, the iMac performs better than the Mac Pro.

This is in stark contrast to the newest Mac Mini, whose maxed-out CTO configuration (a dual-core i7) performs at 50% of the iMac.2  My current Hackintosh, when overclocked, is only 15% slower than the latest and greatest.  Not bad for a 3-year-old computer.
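Those percentages are easier to sanity-check with some multi-core benchmark scores. The numbers below are made up to match the ratios stated above, not actual benchmark results:

```python
# Illustrative multi-core scores chosen to match the ratios in the text.
retina_imac = 13_000
retina_mbp = 11_500     # iMac is ~13% faster: 13000 / 11500 ≈ 1.13
mac_mini_i7 = 6_500     # dual-core i7 CTO at ~50% of the iMac
hackintosh_oc = 11_050  # overclocked, ~15% slower than the iMac

def relative(a, b):
    """Return a's performance as a rounded percentage of b's."""
    return round(100 * a / b)

print(relative(retina_imac, retina_mbp))     # 113
print(relative(mac_mini_i7, retina_imac))    # 50
print(relative(hackintosh_oc, retina_imac))  # 85
```

Framed this way, the desktop premium buys you very little headroom over the laptop, which is the whole argument against building another box.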

Mind you, the Hackintosh scene is pretty small (I would say we’re talking about thousands of people) and I doubt that Apple will ever do anything to stop people from building them, but you have to wonder: is this even worth it anymore?

Based on what I’m seeing, the only real spot where Hackintoshes remain relevant is if you do audio engineering or movie editing and need to supply your own hardware.  There are some use cases for 3D rendering, more so if you are willing to spring for a workstation graphics card.  Alternatively, if you want to explore the platform but don’t have access to Apple hardware, a Hackintosh is a good option.

I can’t even recommend the dual-boot option.  It’s easier to get a separate Windows computer if you want to do some gaming.

Will I build another?  Doubtful. I think I’m past that phase of my life. Should I retire my Hackintosh? Maybe. It’s hard to say whether I’d go with a Retina MacBook Pro or the new 5K iMac.

Tai


  1. I have since rectified this by installing another video card. 
  2. In multi-core benchmarking.  Single core performance difference is negligible. To be honest, I’m kind of disappointed. 

Lightning does strike twice – Linux and Git

We all know that Linus Torvalds is the father of the Linux kernel. It’s the guts of an operating system that powers a multitude of devices: the majority of smartphones and tablets (Android), the majority of servers that power the Web, embedded OSes for the Internet of Things (IoT), smart TVs, Kobo eReaders and even some PCs and laptops.

What many forget is that Linus is also the father of Git, the most widely used source control management tool today, used by developers all over the world.

While it’s easy to argue that the Linux kernel is what Linus will be remembered for (many others have also contributed to Linux’s success), it could be said that his broader impact on computing and development will be Git, which just celebrated its 10th anniversary this week.

In my mind, they are accomplishments of equal scale.  That is just a rarity.  Simply amazing.

HP T610 Plus and pfSense

When I set up my WatchGuard Firebox x550e, I replaced the two 40mm fans with silent models.  I also swapped the PSU with a 90W pico PSU to make a nearly silent system.

One of the replacement fans gave out and started grinding a few weeks ago, so last week I replaced the whole box with an HP T610 Plus coupled with an Intel i350-T2 dual-gigabit Ethernet card.

It’s a default install of pfSense 2.2 with no additional config flags.  It runs at 17-19 watts idle.

The T610 has 4GB RAM, 16GB MLC SSD, and an embedded 1.6 GHz AMD G-T65N dual-core processor. So it handles all my needs without breaking a sweat.  60 Mbit of VPN traffic (one way) barely even registers.

I haven’t done any iperf tests yet, but it should be equivalent to some of the dual-core Intel Atom boxes people use: probably ~750 Mbit/s, and 150+ Mbit/s over VPN.
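One nice side effect of the 17-19 watt idle draw: an always-on router like this costs very little to run. A quick estimate, assuming a $0.13/kWh electricity rate (an assumption; rates vary widely by locale):

```python
# Annual energy use and cost of an always-on 18 W box (midpoint of 17-19 W).
watts = 18
hours_per_year = 24 * 365
kwh_per_year = watts * hours_per_year / 1000  # 157.68 kWh per year
rate_per_kwh = 0.13                           # assumed $/kWh; varies by locale
annual_cost = kwh_per_year * rate_per_kwh

print(round(kwh_per_year, 1))  # 157.7
print(round(annual_cost, 2))   # 20.5
```

Roughly twenty dollars a year, which is a fraction of what a repurposed desktop PC would burn doing the same job.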

Tai

Thoughts on 2014 and 2015

So I’m starting to see end-of-year wrap-ups and predictions for 2015.  It’s always good to take a look back on what is happening in the industry, especially at Kobo.

Largely, a lot of the stuff I am citing is predicated on Mary Meeker’s “State of the Internet, 2014 ed.” that she puts together for KPCB (May 2014). If you haven’t gone through it, I recommend that you do.  It’s a good primer for some of the stuff espoused in the predictions for 2015.

Ben Evans of Andreessen Horowitz has a great presentation called “Mobile is eating the world.”  It was released in Oct 2014 and presents an interesting stack of data that basically confirms some of the forward-looking trends in Mary Meeker’s report.

The competition for consumers’ time will become more ferocious.  The rise of messaging platforms is indicative of this: the “sipping” of conversation and shrinking attention spans will increasingly favour short-form content, a big challenge for Kobo (and our competitors) as we require a magnitude-higher level of engagement to consume our content.  I’m reminded of this NYT op-ed from 2012: “The Flight From Conversation.”

That said, the quality of short-form content may well improve in 2015.  Well-written and well-designed content is starting to pop up, with places like Medium and Quartz leading the way.  These networks will be the ones pushing innovation on the discovery problem and may lead to some interesting applications in the stuff my team designs and manages.

On the hardware side, all I see is “sensors, sensors, everywhere,” with the Samsung Galaxy S5 packing 10 different sensors (gyro / fingerprint / barometer / hall sensor (recognizes whether the cover is open or closed) / RGB ambient light / gesture / heart rate / accelerometer / proximity / compass).  All of this has the potential to be collected and mish-mashed into something useful (not sure what that is… yet).  Is quantified self really a new industry, or are we navel-gazing?

I don’t expect this to change much in 2015, although I feel we’ll be hitting the “trough of disillusionment” very shortly.  Companies will struggle to bring meaning to all the data collected, and the diminishing returns/insight/usefulness consumers see will probably trip some alarm bells with regards to privacy and security.  In addition to all of this, I feel the industry is really just waiting to see what Apple will do.

Steven Sinofsky, of Windows 8 fame, penned an interesting op-ed for re/code:  “Forecast: Workplace trends, choices and technologies for 2015”.  Not necessarily applicable to me at Kobo, but you see the same trends beginning to move into the enterprise space.

While 2014 was a banner year for Kobo, the competition around eReading continued to be fierce. Oyster and Scribd entered the market offering a distinctly different business model, and continue to see success acquiring book rights and growing their user base on VC capital. Wattpad continues to operate without any strong competitor in the serialized, self-published space, capturing the next generation of heavy readers and authors. New eReading startups like Aerbooks and Glose are entering the market with differentiated experiences, touting different solutions to the persistent discovery issue of, “What to read next?”

Incumbents did not sit idle either. Google Books has incrementally improved its Android experience, offering parity with the Apple, Amazon and Kobo offerings and an improved non-fiction reading experience. Amazon, with amazing agility, rolled out its Kindle Unlimited program to follow suit with the likes of Oyster and Scribd. Apple’s bundling of iBooks with iOS 8 has further eroded the iOS platform for virtually every eBook retailer on the market. The latest reports indicate that bundling iBooks into iOS is adding as many as 1 million new users a week. Barnes & Noble continues to play in the market, but saw few, if any, product updates and will most likely be a non-factor in 2015.

On a side note: if you haven’t listened to “Serial”—I wholly recommend it.  It’s great story-telling.

Review: Into the Woods Film (2014)

I want to thank Ben and Sara for taking the kids out to Disney on Ice at the Rogers Centre this past Saturday. It’s been a while since Jen and I spent time alone together, and we decided to watch Into the Woods at the Don Mills Cineplex VIP1.

Now, I’ve seen a stage production of Into the Woods at the Stratford Festival back in 2005 with Jennifer and Jason and enjoyed that particular staging. It stayed very true to the original 1987 Broadway production with Bernadette Peters as the Witch and Joanna Gleason as the Baker’s Wife.

The Hollywood transfer of the musical was well done.  It enhances the setting by providing a luscious backdrop for some of the songs (in particular, “Agony” was over the top; total props to Chris Pine and Billy Magnussen for arguably stealing the show).  However, I feel it missed its mark somewhat when compared to the musical.

Ultimately, I think of the original stage production as a meta-fable, where the moral of the story is that there are consequences to your decisions in the real world.  That’s how I saw it, at least.  I think the film “misses” by underplaying this; it doesn’t give this tenet time to gestate.  The curse is broken in Act I and the Baker’s Wife is magically pregnant; cut to the speech at the castle with the prince and his new wife, and begin Act II.  There are small, subtle things in the original staging that imply the characters are not 100% happy: the strain their new son puts on the Baker and his Wife’s relationship, the Baker shirking his responsibilities, Cinderella’s unhappiness with royal life.  All of it adds a bit more tension.  The removal of the Giantess’s exposition reduces her to a B-movie monster, whereas in the musical you come to understand how much she has lost because of Jack’s actions.  The reprise of “Agony” in the second act underscores Prince Charming’s dalliance with the Baker’s Wife and Sleeping Beauty, making the emotional betrayal Cinderella feels even more impactful.

Every decision we make has consequences, both good and bad.  We need to grow up and accept responsibility.  These themes didn’t carry over from the musical to the film as well as they could have.

There are a few other small things that didn’t transfer well from theatre to film.  Much of the dialogue, especially the pauses, did not land in the film; it made some of the more humorous moments fall flat. There needed to be an audience to play off of.

That said, some of the changes were very well done.  Anna Kendrick’s scene on the staircase made way more sense as an internal monologue than it did as a conversation with the Baker’s Wife. “Agony” was over-the-top (but I wish they had done the reprise because the first performance was so good!). Billy Magnussen showed amazing comedic physicality.  I didn’t miss the elimination of Rapunzel’s storyline all that much.  Chris Pine showed remarkable range, from charming to smarmy, and he can sing too.

Meryl Streep did a good turn as the Witch (especially since she goes up against the likes of Bernadette Peters, Vanessa Williams, and Donna Murphy), although I wonder if they should have gone with Bernadette Peters, who originated the role.

Overall, I walked out of the theatre with a 7.0/10 rating.  After Jen and I had time to dissect it a bit, it’s definitely a 6.5/10 for me.

Rottentomatoes.com seems to agree with me as well.


  1. For those who don’t know, VIP is a luxury, adults-only line of cinemas from Cineplex. It’s nice, though at a price premium ($25 a ticket): you order your food while seated and they bring it directly to your seat. There is also a lounge area where you can order dinner, so in retrospect, it’s really a one-stop movie-and-dinner experience. 

HP T620 thin client

[UPDATE – 2014/11/26: Made a few updates on the hardware.] [UPDATE – 2015/12/23: I ended up taking the Wireless N / Bluetooth combo card from the T610 and putting it in the T620]

I’ve been fascinated with repurposing PC thin clients.  I like them because they are virtually silent and very energy efficient1.  I’ve used one for pfSense, and another as an XBMC box (XBMC is now called Kodi). They can be acquired pretty affordably, as organizations that have invested in these boxes usually swap them out at a steady pace (2-3 year leases).

Earlier thin clients were based on more exotic hardware (embedded CPUs from VIA, Cyrix and AMD), but modern clients use embedded SoC versions of mobile x86 CPUs.  We’re talking full-on dual- and quad-core AMD APUs or even full-out Intel Celeron/i3/i5 chips with Intel HD graphics, all buttoned up into a custom mini-ITX or mATX form factor with an included DC-to-DC power supply.

I managed to pick up an HP T620 Plus on eBay for less than $200 CAD. This model was released last year and features an embedded AMD “Kabini” processor (GX-420CA) with 4GB RAM and AMD HD 8400 graphics, plus a FireGL 2270 video card.  It’s powered by a 90W pico PSU, with a heat-pipe CPU cooler and a low-RPM fan. The slot occupied by the FireGL card can easily be repurposed for better graphics or networking.  In performance and features it is virtually identical to the AMD A6-5200 and AMD Athlon 5350, and about twice as fast as the AMD E-350-based HP T610 thin client that I am using for XBMC.  It should be able to transcode a single 1080p stream in real time.

A few things to note:

  • Storage is mSATA only. I’ve paired it with a 128 GB Crucial M500 SSD.

  • UPDATE: There is also an M.2 (NGFF) port available.

  • UPDATE: In addition to the 2 x USB3 and 4 x USB2 ports on the back and the front, there are also 2 x USB headers inside for flash storage, Bluetooth, WiFi, etc.

  • The onboard graphics uses two full-sized DisplayPorts. This particular model came with a working FireGL 2270 card, which is not very useful; I’ve already removed it.

  • If I use the box for pfSense, I’ll add an Intel GigE dual-NIC

  • I might add a Gigabyte GB WB300D WiFi and BT 4.0 card.  See my note above.

  • The onboard GigE port is no longer Broadcom-based. It’s a cheap Realtek controller (RTL8111/8168/8411 rev C)

  • The PCIe expansion bay only accepts low-profile cards. This is a pretty significant difference from the previous version.

  • 2 serial ports + a Parallel port.

  • The second serial port can be rewired to a VGA connector using a 15-pin VGA header cable.  I am fortunate to have a spare that I tried to add to my Watchguard Firebox x550e box.

  • UPDATE: The VGA connector uses a small 16-pin port that I have never seen before.  I haven’t located a cable yet (the best I can find is a 12-pin VGA header cable).

Add some storage (this one had a bad mSATA drive) and a DisplayPort-to-HDMI adapter and you have a complete system that is basically the same as an AMD AM1 Athlon 5350 build. At less than $200 CAD, I certainly couldn’t build an equivalent off-the-shelf system for that price.

The BIOS on these thin clients is very bare-bones.  Don’t expect to overclock: there doesn’t appear to be any means of OC’ing the chips.

This will most likely replace my newish T610-based XBMC computer.  The great thing is that some parts are interchangeable; for instance, I have a spare Bluetooth and WiFi Mini-PCIe adapter from my T610 that I can reuse.  Not too shabby a system, and I’m excited to put it through its paces as an XBMC client or as a pfSense router with AES-NI support.

Here is a readout of “lspci”:

It runs pretty cool at full load; a Prime95 torture test with all four cores maxed only pushed it to 65˚C (23˚C ambient).

[UPDATE – 2015/12/23]: Right now, I am using the box as a Zwift box.  I upgraded the RAM to 12GB and added an R250 video card.  It runs Zwift at an acceptable 15 FPS at 1080p.


  1. 15 to 18 Watts 

Forbes.com reviews the Kobo H2O

Jordan Shapiro writes 3 Reasons Why Kobo’s Aura H20 is the Perfect Luxury E-Reader:

 Kobo is the quiet Kindle competitor–the underdog in the eReader market. They released their most recent premium eReader at the beginning of October. I’ve been reading on the Aura H2O ever since. I sometimes use my Kindle Paperwhite when I have to read an eBook I bought from Amazon, but I prefer the Aura H20.

I believe this is the first product to have inspired thoughts about French philosophers and epistemological constructs.

Probably won’t be the last.

Hats off to the team for building the best luxury eReader on the market.


Pixels & Widgets

A blog by Tai Toh