Archive | December, 2013

Google Chromecast

Chromecast needs cables and power.

My son presented me with a Chromecast for Christmas. Chromecast has an excellent out-of-box experience: a slipcase box with a smooth matte finish, four simple pieces and a tiny little manual. Unlike in the ads, the Chromecast needs a cable plugged into its rear while the front plugs into the TV’s HDMI port. The rear cable powers the unit, since HDMI doesn’t supply power. The cable is micro-USB; you can plug it into your TV/monitor if it has a USB port, or attach it to the supplied wall-wart power supply. The last item in the box is an HDMI extension cable, just in case your back panel is too cramped to fit the Chromecast directly into the socket.

Setup was drop-dead simple: power up the TV, select the HDMI input, follow the onscreen instructions to install the corresponding Chromecast extension in your Chrome/Chromium browser (or the Android app), and follow its prompts. About the only pain in the neck was typing in the impossibly long WPA2 password, but that’s not Chromecast’s fault. Up and running!

Chromecast is limited to casting YouTube video, playing Google Play music and videos, a handful of proprietary services (Netflix, HBO GO, Pandora, etc.) and (in beta) mirroring a single Chrome tab to the TV.
A little searching (on Google, of course) shows there’s interesting hacking going on already. The hardware is a dedicated system-on-a-chip (SoC) with a single-core CPU, 512 MB of RAM and 2 GB of flash. It runs Linux, of course. And it appears to have a lot in common with the not-so-successful Google TV. Hit up your favorite search engines for the latest details, views of the internals, and some interesting reviews.

The only disappointment was reading the license (yes, who reads the licenses?), which tells me the Chromecast includes some Microsoft DRM to protect “content providers” and that you agree to let the device be updated to keep protecting them. I’d be happier to opt out and remove the “Play Now” features to make room for something more useful, like an XBMC install 🙂

Broken tag cloud

Broken Tag Cloud

I noticed this morning that the tag cloud on my blog’s home page was only three lines long. That’s not right. A little study showed that the three lines were word-wrapping based on spaces within individual tags, and that there was no space between the tags, causing them to run off the right side of the page, where the overflow was hidden. I poked around a couple of places looking for changes to the code that could have caused this: both the WooThemes Canvas theme I’m using and the WooDojo add-on only specify the minimum and maximum font sizes, leaving the default ‘separator’ parameter value. The tag cloud is built up in the wp-includes/category-template.php file, where the default is rather strangely set as “\n”, as documented on the WordPress site. Adding an explicit separator parameter to the Woo elements didn’t seem to have an effect. As a temporary fix to confirm I’m on the right track, I overrode the default in the wp-includes file to “&middot;” and the word-wrap problem is gone. Next, I’ll see if I can find some other place within the Admin UI and/or the database where the separator is specified and see if I can get it reset properly. Stay tuned.
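For reference, the upgrade-safe way to change the separator is a small filter in the (child) theme’s functions.php rather than editing wp-includes. Here’s a minimal sketch, assuming the Woo tag-cloud widget ultimately goes through the standard wp_tag_cloud() code and therefore respects the widget_tag_cloud_args filter (I haven’t verified that against WooDojo); the function name is made up:

    // In the child theme's functions.php: swap the default "\n" separator
    // for a spaced middle dot so the browser can break lines between tags.
    add_filter( 'widget_tag_cloud_args', 'tr_tag_cloud_separator' );

    function tr_tag_cloud_separator( $args ) {
        $args['separator'] = ' &middot; ';
        return $args;
    }

Unlike patching wp-includes/category-template.php, a filter like this survives the next core update.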

CDN Syncing!

Detective Work

So, I rolled up my sleeves and dug into the web server logs and the code of the CDN synchronizing tool. I found the GitHub site where the code came from, forked the code and created a branch with a couple of different attempts at fixing it. On my third attempt, I seem to have a working hourly sync running via the WordPress pseudo-cron functionality. I’ll bundle up my changes and offer a pull request to the upstream developers so they can have the changes as well.
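For those following along, the pseudo-cron side of this follows the standard WordPress pattern: schedule a recurring event on activation and hang the sync routine off that hook. A rough sketch with illustrative hook and function names (not the plugin’s actual ones):

    // On plugin activation, schedule an hourly event if one isn't already queued.
    register_activation_hook( __FILE__, 'cdnsync_schedule' );
    register_deactivation_hook( __FILE__, 'cdnsync_unschedule' );

    function cdnsync_schedule() {
        if ( ! wp_next_scheduled( 'cdnsync_hourly_event' ) ) {
            wp_schedule_event( time(), 'hourly', 'cdnsync_hourly_event' );
        }
    }

    function cdnsync_unschedule() {
        wp_clear_scheduled_hook( 'cdnsync_hourly_event' );
    }

    // The actual work: push new or changed uploads to the CDN container.
    add_action( 'cdnsync_hourly_event', 'cdnsync_run' );

    function cdnsync_run() {
        // ... walk wp-content/uploads and sync changed files to the CDN ...
    }

Remember that WordPress pseudo-cron only fires when someone visits the site, so on a low-traffic blog the “hourly” job can run late.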

Blog optimization update: WordPress, CDN, Speed, Caching, Accessibility

Keep Calm and Clear Cache

I’ve continued to do some research on optimizing the blog’s responsiveness, and I’m pleased with the results. Anecdotal tests this morning, with no local caching, showed a 2-second load time with a 1.2-second DOMContentLoaded event. That’s pretty good. Here are a few notes on things I’ve been working on:

  • Google’s PageSpeed Tools offered some helpful insights.
  • Minifying some of the text assets – HTML, CSS, and JavaScript – is working well, though I’d like to be able to toggle it more easily for debugging.
  • Using the Rackspace CloudFiles caching with WordPress lacks a good automated tool on the WordPress side to keep the cache synced with changes. I’ve been using the SuperCache plugin for local speedups, and it supports a variety of CDNs. The CDN-Sync-Tool plugin is no longer available on the WordPress.org site, and the several forks on GitHub all seem to be out of date. It’s unclear, so far, where the problem is: the WP cron jobs are failing, but whether that’s an internal configuration problem or unsupported calls to an old API, I haven’t worked out yet (one way to narrow that down is sketched after this list). Next time I try this, I’ll do some deeper pilot testing of CDNs with better WordPress support.
  • Inspired by “Why Bother with Accessibility” by Laura Kalbag, part of the excellent 24ways series, I did some initial accessibility testing. The WAVE Web Accessibility Evaluation Tool tests your site for accessibility, an essential feature these days. Accessibility makes your site more understandable and easier to navigate for all users. Disabilities aren’t someone else’s problem; they are a state we will all pass through at one stage or another. There are a few glitches in my templates that I will work to rectify. A larger problem is the observation that my style choices have led to a rather low-contrast site.
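Since the WordPress pseudo-cron only fires on page views, one common way to narrow down failing jobs like the CDN sync above is to take the pseudo-cron out of the picture and drive the scheduler from a real system cron instead. A rough sketch, assuming shell access to the host; the schedule and URL are only illustrative:

    // wp-config.php: stop WordPress from firing pseudo-cron on page views.
    define( 'DISABLE_WP_CRON', true );

    // Then hit the scheduler from a real crontab entry instead, e.g.:
    // */15 * * * *  curl -s http://blog.tedroche.com/wp-cron.php?doing_wp_cron > /dev/null

If the sync jobs still fail when triggered this way, the problem is in the plugin’s code rather than in how the cron events are being fired.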

Using SQLite to Bypass the 2 GB .DBF Filesize Limit

Click to visit Hentzenwerke

The Hentzenwerke site has been updated, crediting me with editing Whil’s latest ebook, “Using SQLite to Bypass the 2 GB .DBF Filesize Limit.” Whil posits an interesting problem: how do you work around the FoxPro 2-gigabyte DBF file limit when the client’s import file balloons in size? In this case, the problem was not that the data had exceeded the limit, but that additional data was included within the import file; more haystack hiding the needles. His solution was to use SQLite as an intermediate step, load in the bloated data, and then cherry-pick the few columns that really needed to be imported for this application. Sample files, instructions on working with SQLite, and example code for importing the SQLite data into VFP are included.
I volunteered to go over his first edition of the ebook and provide a technical review and light edit. I added a few suggestions for alternative techniques, poked at his prose when it got a little awkward, and tested his code and found a few typos. He, in turn, was gracious enough to roll his eyes and ignore my comments. I appreciate him giving me credit as editor on the book.

CDN Glitches

Thanks for all the feedback on the first day running on a CDN. Several issues were noted and addressed.

The fonts have been restored; a couple of odd problems had crept in. Headlines use Asap and body text uses Almost Alike, two fonts served from the Google Web Fonts CDN.

The symbol font FontAwesome is, well, awesome. It’s included in the Canvas framework/theme that I’m using. FontAwesome serves as a source of small graphic symbols and icons for things like the Search magnifying glass and the RSS icon. Canvas embeds the font within the framework’s includes/fonts directories, but when the files are specified in the CSS, they are referred to with a version number, as in src:url('includes/fonts/fontawesome-webfont.eot?v=3.2.1').

This doesn’t actually load a particular version of the file, but it means that caches will be invalidated and the file reloaded if the version number changes. However, moving these files to a CDN exposed some problems: the CDN is really an object store that returns a file only if the requested name exactly matches the name under which the file is stored. So, the file “fontawesome-webfont.eot” doesn’t match one with “?v=3.2.1” appended to the name, and the CDN doesn’t return the file. [Update: I’m rethinking this. While I was getting “Aborted” error messages downloading the font, today it appears to be working, so this may be a misdiagnosis.]

The solution I chose was to locate the single line in Canvas’s font-awesome.css.less file that specifies the source for the fonts, then override it in the custom.css file designed for just that purpose, specifying the Bootstrap CDN as the font-awesome source. See http://www.bootstrapcdn.com/#fontawesome_tab for their suggested invocation. Adding that page’s @font-face declaration to custom.css overrode the earlier declaration and now loads the font, successfully, from the CDN.
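For completeness, the same font could also be pulled from the Bootstrap CDN through the normal WordPress enqueue mechanism in the child theme’s functions.php, rather than via custom.css. A rough sketch, not what I actually did; the handle is made up, and the URL should be checked against bootstrapcdn.com before relying on it:

    // Load the Font Awesome stylesheet from the Bootstrap CDN.
    add_action( 'wp_enqueue_scripts', 'tr_fontawesome_from_cdn', 20 );

    function tr_fontawesome_from_cdn() {
        wp_enqueue_style(
            'font-awesome-cdn',
            '//netdna.bootstrapcdn.com/font-awesome/3.2.1/css/font-awesome.min.css',
            array(),
            '3.2.1'
        );
    }

Note that this pulls in all of Font Awesome’s CSS, not just the @font-face rule, so it could conflict with the copy Canvas already ships.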

Notes from Seacoast WordPress Developers Group, 4-Dec-2013

Seven people attended the December meeting of the Seacoast WordPress Developers group, held at the AlphaLoft coworking space in Portsmouth, NH. The main topic was “Best Business Practices,” which was a great topic, but, as always, the conversations, networking and recommendations that went on around the main topic were also very helpful and informative. Among those tidbits:

  • The Ewww image optimizer can reduce the size of images and speed webpage loading with minimal quality change.
  • Matt Mullenweg delivers an annual “State of the Word” speech with lots of interesting insights.
  • Open question: What topics would YOU like to learn about? The group is about YOU. How can we get YOU to attend?
  • Which SEO plugins are people familiar with? WordPress SEO by Yoast was the most popular one mentioned.
  • A question on speeding up sites brought a recommendation for the P3 Plugin Performance Profiler.

On to the main topic: “Best Business Practices” can easily degenerate into a “Client Horror Stories” session. Kudos to organizer Amanda Giles for keeping a tight rein on the discussion and getting us to focus on covering as much as possible. Andy provided a redacted proposal he had written up for a client, and we reviewed and discussed it. There was a lot of good back and forth. Andy had some very insightful items in his proposal that made it clear what the client would see at each phase, which items were optional or deferred to a later project phase, and how client decisions could affect the outcome in terms of schedule and cost. This was a great launching point for a lot of discussion on terms, contracts (my stance: pay a lawyer for a few hours to draft a good contract!), how to handle open-ended items like design reviews and never-ending revisions, terms for stock photos and graphics, and so forth. The discussion was very worthwhile and everyone felt they had their questions answered and learned a few new things. What more can you ask of a meeting?

Our next two meetings are scheduled for TUESDAY, January 7th (not the normal meeting night) and Wednesday, February 5th. Please consider joining the Meetup group to keep up on the details of upcoming meetings.

New post testing CDN support on WordPress

So, the Content Distribution Network is in place and several tests indicate it is working well — page loads are much faster, and the URLs of the CDN content are rewritten properly — but the next question is whether new materials will be automatically added to the CDN. The picture at the left (and yes, this is an excuse to post a cute dog picture, too) should appear with a link to the high-resolution (1.8 MB) image. On the blog itself, that link should be of the format http://blog.tedroche.com/wp-content/uploads/2013/12/NameOfPicture.jpg, while if the picture is picked up by the CDN synchronization software, it should be uploaded to the CDN and the URL rewritten to http://static.blog.tedroche.com/etcetera. Let’s try it out and see what happens…

Woah. Success first time. Pretty cool.

Some details on what I’ve got set up: I’m using the Rackspace Cloud Files service as the CDN. I had worked with Rackspace before on some hosting projects, and have a friend working there, so I thought I’d try them out first. It appears that their CDN services are at an early stage and don’t have all of the features of some of the more mature products. In particular, it appears that the blog software is responsible for pushing any new or updated content to the CDN. By contrast, the Amazon S3 offering has an ‘origin pull’ feature that will pull content from the original source when it is first requested, and subsequently cache it.

In order to get the contents of my local blog to sync with the CDN, I added the CDN-Sync-Tool plugin. A lot of web searching seemed to indicate I could find this in the WordPress Plugins directory online, but the tool has been pulled from the directory; apparently, it is undergoing some redevelopment. The version I found was on GitHub at https://github.com/WDGDC/CDN-Sync-Tool and installation was no more complex than downloading the ZIP and unzipping it in the plugins folder. Bear in mind that you should be comfortable using the command line and have the skills to review the files you are installing on your machine: there has been no review by the WordPress folks, the code is currently under development, and you may need to deal with bugs, incompatibilities and support problems. So, this isn’t the path I’d recommend for less-technical WordPress developers, and likely isn’t the path I’d recommend for a client looking to put a CDN into production use.

Note that most cache programs and their CDN features are set up in such a way that logged-in users see a slower, but more up-to-date, site, and that in order to test caching you’ll need to log out of your WordPress session.

 

WordPress files now served via CDN

In CorporateSpeak, we’d post, “In our continuing efforts to improve our service delivery and exceed your expectations and delight you with our experience…” but I’d rather speak plainly.

I’ve implemented a CDN – a content delivery network – to speed up the blog’s performance. The blog, running WordPress, was already using WPSuperCache (updated link), but all of the responses were still coming from the same web server. Using a CDN offloads the delivery of static content – images, CSS files, JavaScript – from the web server onto a high-speed network that’s tuned for optimal delivery in the fastest possible time.

I uploaded a large batch of files to the CDN to seed the cache. I will also need to set up another plugin to synchronize changes to the blog and upload those to the CDN.

Let’s see how this works. Let me know if you see any funny business.


This work by Ted Roche is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States.