Silicon Isthmus

So there was this article titled “Silicon Isthmus (San Jose. San Francisco. Seattle. Madison?)” in our local newspaper, the Isthmus. The introduction follows:

When most folks think about computer software and where it’s developed, Madison isn’t the first city that springs to mind. More than likely, cities like San Francisco or Seattle, home of software Goliath Microsoft, top the list.

Think Again. The West Coast may have the ocean and the weather, but when it comes to the software development scene, Madison is in a league with the big players.

According to the Greater Madison Area Directory of High Tech Companies, an annual guide published jointly by the city of Madison and Madison Gas & Electric, there are dozens of local companies churning out cutting-edge software in fields as diverse as biogenetics, entertainment and business solutions.

Of course it’s hard to have a successful software company without the person who has the ideas, the technological know-how, or both. Madison’s a chip off the silicon block in this department too. More than half of the aforementioned software concerns began as shoestring operations, the sweat of an inspired entrepreneur’s brow.

Many of the entrepreneurs were born in Wisconsin, and most of them set up shop here for the same reasons other people do: a relatively low cost of living, more than a modicum of cultural and political goings on and a stable business climate.

The interesting thing is that the article is from 1996, profiling my company Sonic Foundry, the game software company Raven, and a few others. It goes on to talk about things like the difficulty of hiring tech talent in Madison. I’d bet that many Madison techies would assume it was written today instead of 18 years ago.

The more things change, the more they stay the same.

No Bitcoin Campaign Donations For You!

Mark Clear, who is running for the Assembly in Wisconsin, held a fundraiser on Wednesday. I’ve known Mark through the startup community in Madison, so when the Facebook invite showed up I figured I should head over to the “Beer with Clear” event to show my support.

Lately here in town I’ve been “that guy” when it comes to Bitcoin. Whether I’m getting a coffee, a haircut, or paying my office rent, I’m always asking, “Do you take Bitcoin?” I’m amazed at how it’s usually met with curiosity and lots of questions. So of course when I arrived at the donation table I jokingly asked, “So are you guys accepting Bitcoin?” I wasn’t expecting the answer from the candidate: “We can! I’ve got a wallet on my phone.” I probably should have expected that Mark, being a tech guy, would have a Bitcoin wallet.

I told Mark that I’d be happy to donate $100 worth of Bitcoin, and we initiated the transfer. Within seconds the transfer of 0.2213 bitcoins appeared on Mark’s phone, evoking that empowered feeling that many Bitcoin users know so well. Realizing we might have just made the first Bitcoin campaign donation in Wisconsin, we happily tweeted our interaction.

Well, it only took 12 hours to find out: no can do. It turns out that about six weeks ago the Wisconsin GAB had taken up this very issue. See here.

So it’s good to see the GAB trying to deal with Bitcoin, but much of what is in the minutes is either technically inaccurate or misapplied. Unfortunately, this meeting also took place around the time of the Mt. Gox transaction malleability fiasco. There is much discussion in the report of transaction malleability and how it throws doubt on the process. This is technically incorrect. Although malleability allows the ID of a transaction to change before it is placed in the blockchain, once the transaction is in the chain its ID is immutable.

From the GAB minutes:
“While core information in a transaction ID cannot be altered, additional information that would be critical to any committee (such as address and employer information) could be changed through transaction malleability.”

The problem here is that this additional information is not stored within the Bitcoin system at all. Campaigns would simply record a Bitcoin transaction ID alongside the same donor information they collect for any monetary donation (check, Visa, etc.). So even if transaction malleability were a problem (which it isn’t), it wouldn’t be relevant, because donor information is stored separately from the transaction.

In fact, donating via Bitcoin is more transparent than any other means of donation. Had we done this by check or credit card, the transaction wouldn’t be a public record the way it is with Bitcoin.

Here’s the transaction with me sending 0.2213 bitcoins to Mark.

Less than 24 hours later Mark returned the 0.2213 coins to me here.

The FEC is currently deciding on Bitcoin donations. I hope that, if nothing else, they’ll treat them like cash and allow donations of up to $100. I searched the Wisconsin GAB site but couldn’t find the cash limit there; my understanding is that it’s $50. In the meantime I’m forced to use PayPal, credit card, or check to contribute to Mark’s campaign, all of which cost me more in fees and time than our simple 15-second transaction the other night.

Here’s a link to the local story on the transaction


One of the things I find fascinating is the titles that founders assign themselves. Why is the first thing on the agenda, after coming up with an idea for a company, to assign C-level titles to everyone involved? Here are some observations I’ve made about startups and those titles.

If three technical people start a company, you’ll often find that the guy who ends up being CEO is the least technically able of the bunch. Why? Because the CTO is almost always going to be the most technical of the group, and probably the one with the idea for the company in the first place. When the time comes to raise money, the three look at each other and go, “Hmm, we need someone to spend time talking to investors and creating presentations.” Usually the least technical person offers themselves up for this role. If they don’t, it will be forced on them by the other two using the logic that the most technically able people should be working on the product. Once the CEO and CTO titles are assigned, the remaining person usually gets the title of COO. I honestly don’t know why people think the Chief Operating Officer should be the next most technical person, but for some reason this just happens.

Here’s where this can really go awry. I’ve seen more than one case where the person chosen to be CEO just doesn’t have the skill set needed. They’re not as knowledgeable about the product, they’re often not as passionate about it, and in the worst cases they have really poor people skills. Many times I’ve been in a meeting where the person with the CTO title literally runs the meeting. They’re friendly and outgoing and answer all my questions, while the person with the CEO title sits back and defers everything to them.

What does this tell me about your company? You’ve got a really bad day ahead of you. It’s the day when the CTO finally wakes up and realizes that the CEO title isn’t right for the person to whom it’s assigned. More than once I’ve had this hard conversation when they’ve come to me asking what to do. Unfortunately there’s no easy answer here. How do you go to a person, probably your friend, and explain that they aren’t cutting it as CEO? Maybe you’ll get really lucky and they’ll already know they’re not right for the job. There’s a slim possibility they’ll actually be relieved that someone else realized this and is saving them from drowning in the role. But the more likely outcome is that you’re now going to have to ask someone to either leave the company or take a demotion to a different role. Usually the latter turns into the former anyway.

CFO. Do yourself a favor and don’t ever end up with this title in your startup. What the hell are your responsibilities as Chief Financial Officer in a two- or three-person company? Are you the guy in charge of the QuickBooks account? When I’m introduced to a company of fewer than 10 people that has a CEO, CTO, and CFO, I immediately jump to the following conclusion: the CEO and CTO are the technical guy and the business guy, or perhaps two technical guys with one of them as the leader. The CFO is the roommate or buddy who was with them at the bar when they came up with the idea. Sorry to be the bearer of bad tidings, but if you’re walking around with the title of CFO in a company that size, you’re probably dead weight.

My suggestion: in the early stages of your company, it’s best not to assign titles that really don’t mean anything. There’s nothing wrong with being a founder or co-founder. In a startup you should all be working together toward a common goal anyway. By not assigning those titles early on, you’ll save yourself the later headaches when roles naturally emerge that conflict with some arbitrary earlier assignment. However, please don’t take this as a pass to assign yourself a title like “Chief Code Grinder.” I know you probably think it’s original and super cool, but I can assure you it’s neither.

Funny story: I was judging a student startup competition, and two guys walked in with the titles of CEO and CFO. This threw me for a loop, as there wasn’t even a CTO; either the CEO was the tech guy or they had no technical person on board at all. I told them I’d like them to switch their titles to co-founders, as it might serve them better in the future. They told me that they had actually been using those titles until a week prior, when another adviser told them to switch to CEO and CFO.

Well played Coinkite

I was checking out the Toronto-based (my wife’s hometown) Bitcoin processor Coinkite. They’ve got a claim in their FAQ that says:

  • Reliable support—we answer emails—go ahead and try.

So I thought I’d give it a shot.  I found a typo on their developer page, which seemed like a good thing to report.

Check out the emails. Total time to fix the page and respond to me: under two minutes.

Coinkite you are my heroes for today.



Today I dwell in hyperbole…

Or do I?

From one of my Bitcoin slide decks

btc hyperbole



As some of you know, I’m pretty much swamped in crypto-currency programming, research, and promotion these days. I’m sponsoring first prize and am working with Brian and Sheradyn on planning and running the Madworks Crypto-Currency Hackathon in 10 days.

We’ve already gotten a great response, and that was before this article came out. I’m looking forward to a really fun/hectic day. A special thanks to Greenpoint Funds and Foley & Lardner for ponying up the cash for the other prizes and meals. It’s good to see people outside the programmer community interested in the crypto$ scene.

I’ve been doing a number of things to prepare for the hackathon, including getting miners set up, saving off preloaded blockchain data, and a few other tasks to help participants get up and running quickly.

Hope to see everyone there…



Techstars Cloud 2013 Demo Day – back to San Antonio we go

After having a great time at last year’s Techstars Cloud it was time to return. It almost didn’t happen.

Last Wednesday, Kent from Blue Point Investment Counsel and I headed out to catch our flight to San Antonio. Our flight was delayed out of Madison, meaning we were going to miss our connection from Dallas to San Antonio. I figured it was no big deal, since they have flights going into San Antonio every hour. I was wrong. Apparently American had computer issues the prior day that caused all of Wednesday’s flights to be overbooked.

On the way to the airport, Kent was bragging about his TripIt app, which had informed him of the flight problems, and he went on and on about how awesome it was. He had given it his Gmail account, and the thing scanned his email, found all his flights, and then informed him of any changes, gate updates, etc. I told him he was nuts for letting them have his Gmail account.

We stood at the American service counter for over 20 minutes, but our agent couldn’t find us an alternate route to San Antonio. She even brought in someone else who knew all the “secret paths” through the American booking terminal. The best they could do was put us on a bus to Chicago and then fly us into San Antonio late on another airline. At this point we were seriously considering canceling the trip; I try not to fall for the old “bus to O’Hare” trick.

While all this was going on, Kent had his TripIt app out and was saying things like, “Hey, my app says there are 4 seats on flight blah blah blah... did you check that one?” She’d respond with, “No sir, nothing on that one.” I was giving Kent the evil eye during this whole process. “Really, Kent?”

Well, guess what. He finally said, “Hey, I see 4 seats on this flight; can you check that one?” Of course she found two first-class seats for us to San Antonio. Kent + annoying TripIt app: 1, me: 0. But at least we were off to San Antonio.

We got into San Antonio Wednesday night, and the pre-presentation-day meetup was in the lobby bar of our hotel. We got to spend some time with the teams presenting the following day. In particular we hung out with the guys from Drifty, the Madison-based company, who were presenting but weren’t seeking investment. Drifty makes a couple of cool products, one of which, JetStrap, I’ve used for designing web UIs. If you’re doing any work with Twitter Bootstrap (this might actually be a requirement for all accelerator companies) and you haven’t tried it, you’re missing out.

Thursday morning we headed over for the 12 scheduled presentations (only 11 actually presented, but we’ll return to that). In usual Techstars fashion, each company was introduced by its mentor, followed by the CEO giving a short pitch.


Some were better than others, but overall the teams pitched well. I didn’t feel like I was watching an episode of Shark Tank, since none of them ended with “Who wants to join me on this adventure?” I’ve included the program below with the list of this year’s companies.

techstars 2013 flyer

I found a number of the companies interesting, but here’s more information on the ones I spent time with:

The first was Ziptask, which is outsourcing the project management of outsourcing. Got that? It’s actually a pretty cool idea. For anyone who’s tried to run an outsourced project through websites like oDesk or Freelancer, it can be a nightmare to select a developer and then manage them through the actual project.

Shawn Livermore, the CEO, gave me an in-depth demo of both the front- and back-end software they’ll be using to manage projects. I was genuinely impressed with both the functionality and the user interface. This has been a labor of love for Shawn, and it shows in the software. It includes sections for managing the bidding process, deliverables with due dates, milestones, and a host of other things that anyone who’s managed software projects will immediately recognize.

I will probably run a project through them to see how it goes. I talked to a number of people at the event who were thinking the same thing. The only problem I see is that if they enter a period of hyper-growth, they’ll have to hire lots of good project managers quickly. This could be difficult, since in my experience good project managers are tough to come by. Overall, a great idea with some cool software behind it.

The second company I spent some time with was Threat Stack, which offers a cloud solution for intrusion detection. For those of you with servers connected to the internet, you may or may not know what’s going on with your boxes, depending on your level of laziness. I’ve got a server racked over at our local data center 5Nines (whoa! Richard Branson.. wait.. nope, just Todd), so after the pitches I created an account on Threat Stack. What’s impressive is that within 45 minutes I had the intrusion-detection software installed on one of my servers and was collecting data.

The interface is pretty straightforward, as you can see below. If you’re already familiar with Snorby, this will look familiar: the Threat Stack team are the ones behind that open source project for visualizing data from tools like Snort and Suricata. The thing they’ve got going for them with Threat Stack is convenience. I’ve been meaning to set up something like Security Onion for some time but just haven’t gotten around to it. After a few days of it running, I’ve seen literally thousands of attempts from China to log into my servers as root.

threat stack screen shot

My recommendation is to go check it out if you’re doing any kind of server management. I probably won’t continue using it after the demo, as it’s a bit pricey for me ($200/month), but for people with servers of actual importance there’s plenty of value there.

The pitch by the company Conspire quickly captured my attention. You allow them access to your email, and they analyze your interactions with others. They don’t look at the content of the email, just the headers, so they can see whom you’re emailing, when, and how often. This has a lot of cool applications, from finding the shortest path through your network of connections to maintaining relationships with people before they fall by the wayside.

One example CEO Alex Devkar gave was searching their network for someone you don’t know but to whom you need an introduction. Their software finds the most relevant path to that person by searching through your most-contacted connections, your connections’ connections, and so on. The thing to remember is that you don’t have to build this network; it’s already been built from your email history.

I didn’t get a chance to meet with Alex at the show, and since I was concerned about handing over my email, as well as some other privacy issues, I reached out to him. After a 90-minute call I was sold. He was passionate and knowledgeable about both the problem and their solution. It’s always nice to have a first meeting with people who know their stuff.

After the show I was discussing their solution, and there was some confusion about how they would compete with LinkedIn. After further reflection, here’s my answer: LinkedIn is the Rolodex of the web. Instead of handing out cards, we just connect on LinkedIn. The problem is that over time your Rolodex overflows, and it takes conscious effort to keep track of who’s who within it. Conspire does that analysis and work for you. These guys are definitely on to something.

The last company I want to talk about is DataRobot.

dr small

Above I mentioned that only 11 of the 12 companies presented at demo day. DataRobot was the one that didn’t, because they weren’t looking for any more investment. I ended up talking to their CEO, Jeremy Achin, by accident: at the end of my conversation with Shawn from Ziptask, he said, “Hey! You have to meet Jeremy; he’s one of the smartest guys here.” It didn’t take me long to figure out that Shawn was right.

DataRobot is all about predictive analytics: using computer modeling to predict outcomes based on huge data sets. This is used in insurance, banking, stock markets, biomed, and just about everything of importance in the world. Jeremy and his team have been at this for a while, including winning competitions on Kaggle, where it appears anybody who’s anybody in this field hangs out.

After my discussion with Jeremy, I headed back to my hotel to check out Kaggle and the whole predictive-modeling thing. If you want to understand what DataRobot is all about, I recommend you take five minutes to watch this video with Jeremy Howard. It does a really good job.

At the after-party I again met up with Jeremy and his co-founder Tom Degodoy. What sets DataRobot apart is their plans for the company. Most of the start-ups I’ve met with over the last couple of years are all about the technology, and when I ask for a business plan I’m met with a blank stare. I spent a fair amount of time talking with them about company strategy and things like pricing and customer acquisition. You can tell they’ve given a lot of thought not only to their technology but to how they’re going to monetize it. There is no doubt in my mind that DataRobot is a company to keep an eye on.

So, back to Kent and TripIt. It turns out that on our way back to Madison, our flight from Dallas got its gate changed five times. After the first gate change we ran into my buddy Duke from Garbage, coming back from their concert in Mexico City. Here’s a picture of him and Kent looking nonchalant, which I sent to my wife to surprise her with the weird coincidence.


Every time the gate changed, Kent knew about it first and informed us all. After an hour of plane maintenance and a few beers, we finally got on board.

After we boarded, a guy who looked particularly ’70s-rock-musician sat down in front of Kent. I happened to know that Steppenwolf was playing at the Ho-Chunk casino in Baraboo on Saturday, so I leaned forward and asked him if he was with Steppenwolf. Sure enough, he was the guitar player. So Kent ended up with Garbage in the seat behind him and Steppenwolf in the seat in front of him, and I spent a good portion of the flight discussing ’70s rock.

If you want to know how I knew Steppenwolf was in town, it was this text message from Duke to me and Pdub a month ago.


Phone companies still suck

So my wife has a phone number from when she lived in South Carolina. This causes problems when people are expecting her to call from Madison and instead they go, “Who the heck is calling me from South Carolina?”

So I set out to fix this with our wireless provider AT&T.  For those of you who use AT&T I bet you’re already cringing.

I started by searching online and logging in to our wireless portal. I learned that they’ll charge you to change phone numbers, but if you’ve moved from one region to another they’ll change your number to a local one for free. OK, sounds good. But there was one issue I wanted to deal with: my wife has had this number for at least five years, and I knew it would take some time to notify all her contacts that she was changing numbers. So I thought, “Hey, they must offer some three-month forwarding service or something for an old line, right?” I thought wrong.

I was told that the minute the number was switched, she would no longer receive calls on her old number, and anyone trying to call it would get a disconnected message. So this is my rant. AT&T, WHY THE HELL DO YOU NOT OFFER A SERVICE TO DEAL WITH THIS? (Sorry for the all caps. I rarely use them for emphasis, but this seems like an appropriate place.) I mean really, I would have been more than happy to pay for an additional forwarding service for a few months to make this simple.

Fine, OK, so how to get around this? I told the representative what I was trying to do, and we came up with a solution: get a cheap secondary phone for a few months with a new Madison number. When the phone arrived, I would activate the new number, call AT&T, switch the new number to her iPhone, and put her old number on the cheap phone. That way she could start using the new number for outgoing calls yet still receive incoming calls on the old number until she informed everyone of the change. Not a perfect solution, but workable.

The phone arrived last night as we were headed out to dinner, so I left it until this morning. They sent me some crappy LG phone with a horrible UI, which actually saved me. I had to call in to activate the phone, but every time I needed to type a number on the keypad the LG phone had locked me out, so I ended up with an actual human operator. This would turn out to be lucky for me. She told me I needed to accept the agreement for the phone and hooked me up with an automated recording, which notified me that I was signing up for a two-year contract. At this point I’m telling her, “Whoa, whoa, whoa, that isn’t what I wanted.”

I explained what I was trying to do and how the original sales guy had somehow decided to sucker me into a two-year contract. She explained that any time you get a phone provided by them, there’s a two-year contract involved, and that if I didn’t want the contract I’d have to go buy a phone and then activate it on the new line.

So now I have to send back the phone, on my dime, to get a credit for this crappy thing. I also have to head out to a local store to buy another phone so I can hook it up to the new line. It really shouldn’t be this hard.


Not Ready for Prime Twine : A review of Super Mechanical’s Kickstarter hardware

The lack of an in-apartment washer and dryer can make life a bit of a drag. Earlier this year I was wondering why our building’s washers and dryers don’t have internet monitoring. I did some research and found a number of companies offering internet-monitored washers and dryers, but they mainly seemed to be in use on college campuses. Makes sense.

I brought it up to my landlord, and it turned out that the company that sold him the system in the building had internet monitoring available and was willing to do an upgrade. There was, however, the small problem of cost: they wanted over $200 a machine to upgrade the “readers,” plus an ongoing monthly fee of around $100 for the service. He and I concluded that people weren’t willing to pay $5 per load of laundry and that we’d all just have to keep using our microwave timers and iPhones to keep track of our laundry.

It was around this time that I ran across a project on Kickstarter that I thought might help. I’m a big fan of Kickstarter and have funded a variety of things there, including Double Fine Adventure, Mail Pilot, and my personal favorite, The Travoltas, whom I’ll be traveling to see in Houston this February. But the Kickstarter I’m talking about in this case is this one, called Twine.

It looked interesting: a simple sensor system with built-in wireless. It included temperature and vibration sensors, and since I had this washer/dryer thing stuck in my head, I figured I should check it out. The project had already been fully funded, but I went ahead and pre-ordered one and then promptly forgot about it. That is, until about a month ago, when I got an email saying my Twine was ready to ship. The email also said that things had turned out a little more expensive than expected, so the cost of the Twine with the full sensor package was $199 instead of the $174 quoted when I pre-ordered.

My Twine arrived within a couple of days, and I have to admit that, as far as packaging goes, it was amazing. The guys at SuperMechanical certainly know how to put together a cardboard package.

The Twine itself as well as all the add-on sensors are sleek looking and well made.

Open Box 1
Open Box 2

After pulling everything out of the box I was pretty excited to start playing with my new toy.  Unfortunately this is where the excitement ended and the disappointment began to set in.

I installed the batteries in the Twine and fired up my laptop. A new wireless network appeared, and after connecting my laptop to it, a browser popped up asking me to select my local network and enter its password so the Twine could jump on my wireless. I had a moment of trepidation here, as I’m not big on giving passwords to devices I don’t have the source code for, but I moved ahead.

After that I had to create an account with SuperMechanical so that my Twine could start sending data to their website. This was all a relatively painless process, and within a few minutes I could see that the Twine knew what orientation it was sitting in and was reporting the temperature of my apartment. I immediately noticed that the Twine thought my apartment was anywhere from 78 to 80 degrees, about 10 degrees too high. Oh well, no biggie; next I thought I’d check out the vibration sensor. “Currently not supported.” Huh, what? That’s why I bought the thing. Nope, I was out of luck.

I moved on to play with the other sensors: moisture, magnetic switch, and breakout. They all worked as expected.

The Twine is easy to take apart, and if you look at the main board it’s pretty clean, with a 32-bit ARM-based MCU, a GainSpan low-power Wi-Fi module, and some flash memory. The board has a Micro-USB port, but it’s only for power, not communication. The 4-pin sensor adapter plug is wired as follows: 3.3V, GND, Input, and 1-Wire. Input carries the voltage from a sensor, and the 1-Wire line is used to identify that sensor; each of the sensor boards has a 64-bit unique-ID chip with a 1-Wire interface. I’m pretty sure the people who are planning to plug multiple sensors in using a Y cable are in for a surprise.

Take it apart
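Those 1-Wire ID chips are worth a quick aside. Dallas/Maxim-style 64-bit ROM codes (something like Maxim’s DS2401 silicon serial number is my guess for what’s on the sensor boards) carry a family-code byte, a 48-bit serial, and a CRC-8 byte, so the host can verify it read the identity correctly over the single data line. A sketch of that standard CRC, under the assumption the Twine follows the Dallas/Maxim convention:

```javascript
// CRC-8 as used by Dallas/Maxim 1-Wire ROM codes: polynomial
// x^8 + x^5 + x^4 + 1, processed least-significant bit first,
// initial value 0, no final XOR.
function crc8Maxim(bytes) {
  let crc = 0;
  for (const b of bytes) {
    crc ^= b;
    for (let i = 0; i < 8; i++) {
      crc = (crc & 1) ? (crc >> 1) ^ 0x8c : crc >> 1;
    }
  }
  return crc;
}

// A ROM code is valid when the CRC over all 8 bytes comes out zero:
// 7 payload bytes (family code + 48-bit serial) followed by the CRC byte.
function romCodeValid(rom8) {
  return rom8.length === 8 && crc8Maxim(rom8) === 0;
}

// Build a plausible ROM code: family 0x01 + a made-up serial + its CRC.
const payload = [0x01, 0xde, 0xad, 0xbe, 0xef, 0x12, 0x34];
const rom = payload.concat(crc8Maxim(payload));
console.log(romCodeValid(rom)); // true
```

This also hints at why the Y-cable idea is dicey: two ID chips answering on the same 1-Wire line need a proper search/enumeration protocol, which the Twine firmware may well not do.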

After playing around with the website for a while, what I really wanted was to get the data directly from the device to one of my local computers. Why make me round-trip from the Twine to the SuperMechanical website and back down to my local computer? I understand the “hey, you can see it anywhere on the internet this way,” but hey, maybe I only want to see it locally.

I posted a message about local network support in the software developer section but didn’t receive any kind of acknowledgement from the powers that be. A couple of other Twine users responded that they’d like to see the same support. My guess is the SM guys are probably a bit overwhelmed right now. (I sent them a copy of this post back on December 19th to see if they had any comments and never heard a word.)

At one point I considered throwing a proxy in the middle to see what the device was actually sending to the SM servers, but I decided not to invest that much time, since I was probably going to shelve the thing for now anyway.

The Twine sends its updates to the server at a couple of different rates: one is about every 45 seconds; the other is a continuous fast-update mode. It will switch back to the slower mode on its own unless you click a button on the website to stop it. This was another reason I wanted to access it locally instead of making that crazy round trip just to monitor it from 3 feet away.

Although the actual hardware of the Twine seems well thought out and well designed (maybe even a bit over-designed), the software side is quite lacking. As it stands today, if you want to sense temperature plus a single input of one other type, send that info to a server somewhere, and then have it email you, tweet, or post to an HTTP URL using very simple rules, then this is the device for you. If your use doesn’t match those exact criteria, $200 seems a little high when you can buy an Arduino or a Raspberry Pi for less than a fifth of the price.

I hope they open-source the system, or at least open up the interface to the local network; that would allow alternate uses for the hardware to grow. Without that, I think the Twine is in for a rough road. I wish them luck, as they seem like good guys who really know how to do physical product design. Right now, though, there’s better value for your money if you have any DIY skills at all.

Using WordPress for a Simple News Web Widget.

I often find myself needing a simple blog-style news feed for the various sites I’m creating. In the past, when I was still doing my primary development on .NET, I rolled my own solution using a SQL database to hold the entries. At first I’d just put the data into the database directly, but of course over time I wanted to be able to edit and add new entries from a web interface. So I found myself incorporating TinyMCE, and pretty soon I was writing a whole management interface just for editing news posts. I knew it was time to stop myself before completely falling prey to NIH (not-invented-here) syndrome.

Lately I’ve been doing a fair number of small projects using WordPress, and it seemed I could just use its CMS to hold my news posts. I’d have a repository for the posts as well as the ability to edit them.

To use the post data in my other sites I wanted to export the posts in JSON format. I’ve written a fair amount of server-side JSON exporting code, so I knew this wouldn’t be a problem. Luckily I took a look in the WordPress plugin repository first, and there’s already a pretty nice plugin that dumps any post as JSON.

If you want to do this, first download and install the latest version of WordPress. This should take you all of 5 minutes, unless you’ve never done it before, in which case it might take 30 minutes if you also need to install LAMP (or WAMP if you’re on Windows). You can also do it in about 5 minutes on an Amazon EC2 micro instance by following this excellent guide.

You will also need to download the JSON plugin and copy it to the plugins directory. You can now create new posts in WordPress and pull the data out as JSON.

I’m using this for my Madtown yoga website. If you look at the site you will see a list of news posts on the right that are fed by the WordPress blog.

The simple blog interface is here, where you can see a series of posts in the default WordPress skin. If you look here you can see the exact same posts rendered out as JSON data. Since I’m only using the blog as a data repository I didn’t bother messing with the WordPress UI.
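To give a feel for what comes back, the plugin’s response is roughly shaped like the object below. The field names and values here are abbreviated from memory for illustration; the plugin’s actual output has many more fields per post, so check its documentation for the exact schema.

```javascript
// Rough, abbreviated shape of the JSON the plugin returns for a
// recent-posts query. Illustrative only; the real response carries
// many more fields (dates, slugs, author info, etc.).
var response = {
  status: 'ok',
  count: 2,
  posts: [
    { id: 1, title: 'First news item',  content: '<p>Body HTML of the first post</p>' },
    { id: 2, title: 'Second news item', content: '<p>Body HTML of the second post</p>' }
  ]
};

console.log(response.posts.length); // 2
```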

If you look at the HTML for the news section on the right you will only see two lines of relevant code.

<script src="/Scripts/wpnews.js" type="text/javascript"></script>
<div id="news-widget-container"></div>

I’ve implemented this as a web widget, which is a way to let people place some kind of functionality on a web page very simply. A container div with a specific ID is placed on the page, and that div is then filled in on the client side by the corresponding JavaScript. If you want a good tutorial on web widgets, check out this article; it’s the one I originally used to craft this news widget a few months ago.

In this case the JavaScript checks whether jQuery v1.6.4 is already loaded on the page and, if not, loads it. It then reads the data from my WordPress site as JSON into an array and adds the posts as a series of divs to the HTML DOM. The actual formatting of the data is handled by a separate CSS file.
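A stripped-down sketch of that flow might look like the following. This is not the actual wpnews.js; the renderPosts function and the feed URL are made up for illustration, though the two CSS class names are the real ones the widget uses.

```javascript
// Illustrative sketch of the widget's fill-in step, not the real wpnews.js.
// Turn an array of post objects into the divs the stylesheet knows about.
function renderPosts(posts) {
  return posts.map(function (post) {
    return '<div class="FrontPageNewsTitle">' + post.title + '</div>' +
           '<div class="FrontPageNewsContent">' + post.content + '</div>';
  }).join('');
}

// In the browser, jQuery would fetch the feed and fill the container div:
// $.getJSON('http://example.com/?json=get_recent_posts', function (data) {
//   $('#news-widget-container').html(renderPosts(data.posts));
// });
```

Keeping the rendering as plain string-building means the page only needs that one container div; everything else arrives with the script.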

If you want to have some fun make a simple text document on your desktop that looks like this and save it as test.html.

<script src="" type="text/javascript"></script>
<div id="news-widget-container"></div>

If you then open this test.html file in your browser you should see an unformatted list of my news posts in the document you just created. If you add in the line

<link href="" rel="stylesheet" type="text/css" />

you will then see it with my formatting applied to the divs. What’s nice about widgets, besides the simplicity of using them, is the ability to customize their look on your page. In this case, add simple formatting to the classes FrontPageNewsTitle and FrontPageNewsContent and you can make my news feed look however you like on your web page.

If you take a look at the source code for wpnews.js it should be pretty straightforward to understand what I’m doing. The majority of the code at the bottom is simply there to get jQuery loaded if it isn’t already on the page.

A special note to be aware of when pulling data from a web site in JSON format: often the domain that has the data is different from the domain the page was loaded from, and JavaScript is not allowed to load JSON data cross domain. The way around this is a format called JSONP. The only difference between JSON and JSONP is that in the latter the data is wrapped in a function call, and you get it as an object when that function is invoked. Since script tags can be loaded and executed cross domain, the restriction no longer applies.
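To make the wrapping concrete, here’s a tiny runnable sketch. The callback name handlePosts is made up for illustration; in practice the callback name is usually passed to the server as a query parameter.

```javascript
// Plain JSON the server might return:
//   {"posts":[{"title":"Hello"}]}
// The same data as JSONP, wrapped in a call to a function you define:
//   handlePosts({"posts":[{"title":"Hello"}]})
// The browser loads the JSONP response via a <script> tag (scripts CAN be
// loaded cross domain), which simply calls your function with the data.

var received = null;

function handlePosts(data) { // the callback named in the JSONP request
  received = data;
}

// Simulate the script body the remote server would send back:
handlePosts({ posts: [{ title: 'Hello' }] });

console.log(received.posts[0].title); // "Hello"
```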

Raspberry Pi a la Node – Running nodejs on the Raspberry Pi

My local hackerspace Sector67 held a 24 hour build fest this weekend. Late last week, in preparation for the event, a few of us purchased some Raspberry Pis. If you’re not familiar with the Pi, it’s a credit-card-sized ARM-based computer with a USB port, Ethernet port, GPU, and HDMI and composite video out that runs Linux, all for about $35. You can read more about it at the link above, but the “overall real world performance is something like a 300MHz Pentium 2, only with much, much swankier graphics.”

I bought two with the plan of installing nodejs on them. I figured it’d be a super cheap way to create a web-based controller people could use in other projects. I’m happy to say it only took me about 3 days to make this a reality. However, it was pretty much 3 days of frustration that I’d like to spare the rest of you, so here’s the scoop.

I bought the bare-bones Raspberry Pi for $35. When you buy the bare board you’re going to need some more hardware to make it run. Here’s a quick list:

1) A Class 4 SD card, 4 GB to 32 GB in size. The Pi uses this as its boot device.
2) A 5V 700mA micro USB power supply.
3) A powered USB hub.
4) A USB mouse and keyboard.
5) A monitor with HDMI input, or an HDMI-to-DVI converter if your monitor is DVI only.
6) An SD card reader/writer to image the SD card with the Linux boot image.

My Pis arrived in the middle of the week, along with the two 32 GB SD cards I ordered off Amazon. I rooted around my junk hardware closet and dug out an extra USB keyboard/mouse and an old BlackBerry phone power supply that was 5V/650mA micro USB. The one thing I was missing was an SD card reader/writer. A quick search on the internet showed that my local Target claimed to have a $10 cheap-o reader in stock. Of course when I got there they didn’t have it, but they did have a $20 Belkin universal media writer, as well as a couple of powered USB hubs on closeout for $5 each. So I grabbed them and headed over to Sector67 to get things up and running.

The first order of business was to get the Linux image onto my SD cards. I decided to do this from my Windows-based laptop, so I downloaded the latest Debian image from the Raspberry Pi download page. The actual release I picked up was the September 18th Raspbian “Wheezy” image. After downloading and unzipping it, I was left with an .img file I needed to get onto the SD cards. To do this I needed a piece of imaging software called Image Writer for Windows; after downloading and installing it, I was able to quickly image both SD cards with the Raspbian .img file.

Here’s where I ran into my first problem. I have a Dell monitor with HDMI input, but I didn’t have it at Sector. I had an HDMI cable with an HDMI-to-DVI converter, but when I plugged it in all I got was a blank screen. I tried it with both Pis and got the same result. Fine, the Pi also has a composite video output, so I tried that next. Same problem. For some reason, when I plugged both the composite and the HDMI/DVI in at the same time I suddenly got video out through the HDMI/DVI combo. If I unplugged either one, neither output would work; plug them both back in and the HDMI/DVI would work again. At least it let me get working on the thing. Later on I switched to my HDMI monitor with a straight HDMI connection and everything was fine, though occasionally I would get a weird “puttering” sound through the HDMI audio. This happened on both Pis, so I’m not quite sure what’s up with that.

I was finally up and running with a basic Linux image on my Pi. Now it was time to get nodejs going. I did some looking around the web, and the best article I could find was this one from July. It looked highly promising. Here’s where I ran into my next gotcha: while editing in vi I found that I couldn’t enter a # character on my keyboard. I got around it by cutting and pasting a # from somewhere else in the document, but a few minutes later, when I needed the | character on the command line, I knew I was going to have to actually fix the problem.

The image on the Raspberry Pi website was obviously generated by someone in the UK, as the keyboard settings are set for a UK layout, and this is what was causing me grief. To fix the issue, first type the following:

$ sudo dpkg-reconfigure keyboard-configuration

You’ll have to go through a few dialog boxes to select your keyboard and the US region; at one point you have to select Other to get to regions besides the UK ones. After finishing you can reboot the system, or just do the following:

$ sudo invoke-rc.d keyboard-setup start

Moving on with a functional keyboard, I followed the instructions in the article, downloaded the 0.6.15 node distribution, and attempted to compile it on the Pi. Here’s where things started to suck. The first thing I found was that it took anywhere from 90 minutes to 2 hours to compile the v8 library on the Raspberry Pi. I immediately, and by immediately I mean 90 minutes later, ran into a problem when trying to link the library. After searching around the net I found what I thought was my issue and recompiled. 90 minutes later I hit another issue. This went on for some time, trying different compiler options to get things working. After 4 hours or so of frustration I decided to head home and work on it there. Here came my next problem: in my haste to head home I didn’t bother issuing a

$ sudo halt

so when I got home my SD card image was corrupted and wouldn’t boot correctly. Awesome. So after re-imaging the card, expanding the file system to use the whole card (I didn’t mention this earlier, but you need to do this to use all the space on the card, and it takes something like 30 minutes), and fixing the keyboard thing, I was ready to go back at it.

Now for another woe. The Pi has an option to overclock/overvolt it for better performance. In the past this wasn’t recommended and could void your warranty. As of the latest release there is a turbo mode that lets you crank your 700MHz processor up, in a variety of steps, to 1GHz. However, they note that you need to “test stability,” as it’s dependent on your particular Pi and your power supply.

At this point I was getting pretty frustrated with compile times, so I cranked that baby up to 1GHz, re-downloaded node, and got to work compiling. Unfortunately this ended really quickly, as the node compile died almost immediately. I took a look at the file that wouldn’t compile and found that it was corrupted. Turns out my “stability” was not good. I assumed my power supply was not up to the job, so I cranked the CPU back down to 700MHz and reformatted again to get rid of the corrupt files.

I continued to get nowhere with building node and spent a lot of time searching for other ways to get it to build. I probably tried 3 different articles people had posted, as well as one binary distribution that immediately died with a segmentation fault. Finally, on Saturday morning, I came across this gist. This is the one that made it all happen. Down near the bottom is a post from a month ago with the script that worked for them as well as for me:

export NODE_VER=0.8.7
cd ~
rm -rf ~/work/node-build/node-$NODE_VER;mkdir -p ~/work/node-build && cd ~/work/node-build
rm -rf ~/opt/node-$NODE_VER;mkdir -p ~/opt/node-$NODE_VER

curl -O

curl http://nodejs.org/dist/v$NODE_VER/node-v$NODE_VER.tar.gz | tar xz
cd node-v$NODE_VER
git apply --stat --apply ../arm-patches-2.patch

./configure --shared-openssl --without-snapshot --prefix=~/opt/node-$NODE_VER

date;time make CFLAGS+=-O2 CXXFLAGS+=-O2 install;date

If you copy this into a shell script and run it you should get an actual build of node 0.8.7 and npm that works.  After the build I was quickly able to install express and have a web site up and running on the Raspberry Pi.  I was also able to install the serialport library for nodejs and talk to an Arduino board over the USB connection (a topic for another day).

I was so excited to have node running that I went ahead and compiled redis for the Pi as well. I just pulled down the distribution from this link and did a

$ make -j 2

When it’s done, it tells you to run “make test,” which I did as well, just for grins. This takes quite a while to run, but it passed everything for me.

I’m really looking forward to messing around with this more. A super cheap, super tiny embedded Linux system powered by nodejs has got me fired up to put all sorts of stuff online. A HUGE HUGE tip of the hat to Adam Malcontenti-Wilson for the gist that got me up and running.

Finally here’s the obligatory screenshot of my Raspberry Pi running the default Express installation on nodejs.