Midnight Cheese

Exploring Design, Web Development and Art

SpaceX-4 Launch & NASA Social: A Homecoming


I can’t tell you how excited I am to have been selected to attend a NASA Social event featuring the fourth launch of SpaceX’s Dragon capsule to resupply the ISS. After seeing what my friend Ernie was able to experience during his NASA Social trip, I knew I had to do one, and here’s my chance.

What makes this trip to the Space Coast extra special for me is the homecoming feeling this trip has taken on over the last few weeks.

Growing up in South Florida, the space program was always prominently featured in school. In 6th grade our class had a unique opportunity to drive up to Titusville and experience a one-day version of Space Camp. We had a lot of fun on that trip. During the bus tour portion, I specifically remember passing a launch pad with a Shuttle positioned and ready to launch within the next few days.

Prior to that 6th grade trip, in the mid to late 80s, my family would often visit my grandmother who lived in Titusville. Every once in a while we’d sit in her house and notice the windows begin to rattle just a bit. We all knew that meant NASA was launching a Shuttle that day.

We have another family member who worked for a NASA contractor doing quality control on the engines built for the Shuttles. We celebrated his 80th birthday this summer.

Later, in the 1990s, while living in Miami, I vividly remember my family driving out of town to see the Shuttle travel across the sky during a night launch. It was amazing that we could see a launch from that distance.

NASA and space travel were always intriguing subjects growing up. Getting a chance to go back to Titusville almost 20 years after my last visit for an even more in-depth look at the space program is more than I could ever ask for. I can’t wait!

Listening and Decoding Data Packets From the International Space Station


On June 19th I set up my Kenwood HT paired with a magnet-mount antenna in the driveway, tuned the radio to 145.825 MHz, and listened to data packets from the International Space Station magically flow through the radio’s speaker as the ISS flew overhead.

Over a period of three minutes, 22 data packets were recorded. I used my iPhone placed next to the handheld’s speaker as a recording device.

The audio can be listened to here, and the decoded packet information is displayed below.

fm K9CMI to APX205 via RS0ISS*
=4007.30N\08817.95WhPHG5130_73 from Paul in Champaign, IL_M
fm K4IPH to CQ via RS0ISS*
=4035.60N/07934.34W-Bob in Vandergrift, PAM
fm KB2M-2 to 3YTU0W via RS0ISS*
`g](l `/`Winter QTH/SatGate _%M
fm W2THC to APRS via RS0ISS*
=4006.06N\07409.14W(**de FN20wc**
fm K9CMI to APX205 via RS0ISS*
=4007.30N\08817.95WhPHG5130_73 from Paul in Champaign, IL_M
fm N2RRJ to CQ via RS0ISS*
=4000.57N/07408.13W-Greetings from N2RRJM
fm W9QO to STPX1V via RS0ISS* SGATE WIDE2-2
‘oIPl -/]M
fm W2THC to APRS via RS0ISS*
=4006.06N\07409.14W(**de FN20wc**
fm W5XV to DADV9S via RS0ISS*
‘vQvl -/]=M
fm W2THC to APRS via RS0ISS*
=4006.06N\07409.14W(**de FN20wc**
fm KB2M-2 to 3YTU0W via RS0ISS*
`g](l `/`Winter QTH/SatGate _%M
fm KC4MCQ to CQ via RS0ISS* SGATE WIDE2-2
Salt Springs FloridaM
fm W2THC to APRS via RS0ISS*
:W5XV :How’s it goin’, eh?
fm W2THC to APRS via RS0ISS*
=4006.06N\07409.14W(**de FN20wc**
fm AA9LC to W2THC via RS0ISS*
de Grant AA9LC
fm WU2V-1 to APRS via RS0ISS*
:HEARDlast:W5UL-15,KB2M-2,AA9LC,W2THC,WU2V-6,KB1CHU,N2QKV
fm N8ROA to CQ via RS0ISS*
=3948.45N/08202.30W-DARRIN — ROSEVILLE, OHIO {UISS52}
fm KC4MCQ to CQ via RS0ISS* SGATE WIDE2-2
Salt Springs FloridaM
fm KC4MCQ to CQ via RS0ISS* SGATE WIDE2-2
=2920.58N/08144.33W`Salt Springs, FloridaM
fm KB2M-2 to 3YTU0W via RS0ISS*
`g](l `/`Winter QTH/SatGate _%M
fm W2THC to APRS via RS0ISS*
=4006.06N\07409.14W(**de FN20wc**
fm WU2V-1 to APRS via RS0ISS*
North Carolina Urban Search and Rescue Field Communications
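
For fun, here’s a rough JavaScript sketch of how the position portion of one of these packets could be converted to decimal degrees. Per the APRS spec, latitude is sent as ddmm.mm followed by N or S and longitude as dddmm.mm followed by E or W, with a symbol table character in between. The function name and regex below are just for illustration, not part of any decoder I actually used.

// Sketch: convert an APRS position report like "=4007.30N\08817.95W..."
// into decimal degrees. Latitude is ddmm.mm plus N/S, longitude is
// dddmm.mm plus E/W, separated by a symbol table character.
function parseAprsPosition(payload) {
  var match = payload.match(/(\d{2})(\d{2}\.\d{2})([NS]).(\d{3})(\d{2}\.\d{2})([EW])/);
  if (!match) { return null; }
  var lat = parseInt(match[1], 10) + parseFloat(match[2]) / 60;
  var lon = parseInt(match[4], 10) + parseFloat(match[5]) / 60;
  if (match[3] === 'S') { lat = -lat; }
  if (match[6] === 'W') { lon = -lon; }
  return { lat: lat, lon: lon };
}

// Paul's packet above works out to roughly 40.12 N, -88.30 W: Champaign, IL.
console.log(parseAprsPosition('=4007.30N\\08817.95WhPHG5130_73 from Paul in Champaign, IL'));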

Building the Softrock Ensemble RXTX


Looking for a cheap HF alternative, I picked up one of the Softrock Ensemble RXTX kits and spent the last several weeks slowly putting it together. The Ensemble is a software-defined radio. In short, the hardware provides the signal, leaving the computer software to do the heavy lifting, including both tuning and processing of received signals.

Softrock Ensemble RXTX

I had high expectations for this little board and it delivers fairly well. The receive functionality is fantastic, even though setting up the software is a nightmare. Transmitting is still a work in progress. However, I think I’m close.

The beauty of SDR
Software defined radio is all about the visuals. In essence, SDR allows you to “see” a wide band of spectrum all at once. If you think about the FM radio band that your car stereo listens in on, the SDR equivalent shows you all the FM transmissions/signals at once. Each transmission is represented by a column of lovely reds and greens flowing up the screen. To listen in on a single transmission, simply click where you want to listen. No more scrolling or scanning until you find something that doesn’t sound like static.

HDSDR displaying 80 meters.

In the ham radio world, instead of broadcast stations, we’re looking at individuals transmitting conversations through voice, morse code, and digital modes. The same principle applies. Click on a column to listen in.

Building the kit
This was the most complex soldering project I’ve taken on, and I have the errors to show for it. Everything went fairly smoothly until I soldered in a transformer, forgetting to first strip the coating off the wires. In removing the transformer, I damaged the board enough that the metal ring lining one of the solder points was ripped out. I had to use a piece of scrap to hard-solder the two points together. On the upside, things still tested out and functioned properly.

At the time, I thought the build process would be the most difficult part of this project. Little did I know, the software setup for this board would be one of the most difficult software setups I’ve ever encountered.

Being a UX guy, it typically doesn’t take me long to figure things out when it comes to software. However, words cannot describe the frustration I experienced trying to get the SDR software set up to use this kit.

HDSDR software
When HDSDR works, it’s a beautiful thing. The software really is easy to navigate. Tuning is straightforward and the visuals are amazing.

As awesome as the software is, there’s no good step-by-step documentation to walk the user through the setup. I understand there are a lot of variables in these setups, including various sound cards and even the way individual boards are built, and that may be one reason for the general lack of documentation. The other issue is that, while setting things up, I tried so many different combinations of settings that I can’t remember what I did to get the thing working. That makes writing documentation difficult, and I wish I had taken notes along the way for the benefit of others.

Receiving
Once the software is in good shape, let the receiving begin. I built my kit for the 20, 30, and 40 meter bands, but I’m still able to receive from 80 meters up to about 15 MHz very well.

Speaking of sound cards for a moment: from what I can gather, a really good sound card is what’s needed to make these kits work well. The sound card in my desktop machine does a great job. On the other hand, when plugging the Ensemble into my laptop, I see mirrored instances of signals. Unfortunately, playing with HDSDR’s image reduction settings yields no change.

Transmitting
As difficult as the setup was for receiving, I still haven’t figured out the proper configuration for transmitting. The unit will transmit; I’ve been able to pick up my own transmissions on another receiver. The issue I’m having is a double transmission of the signal. In the screens below, you can see what that looks like when transmitting digital modes. Could this simply be because my receiver is so close to my transmitter?

Double signal transmitting JT65 with the Softrock.

Double signal transmitting PSK31 with the Softrock.

HDSDR displaying the double signal when transmitting.

I’ve posted to the Softrock group and read all there is to read, but I’m pretty well stuck at this point. I don’t have the knowledge to know if it’s hardware related or software related.

Until I figure that out, my transmitting days remain in the not-so-distant future.

In the end, this has been a fun project and I use the kit regularly. It’s a great learning tool. If you’re on the fence about putting one of these kits together, I highly recommend it. Be prepared, however: a time-consuming battle awaits when it comes time to interface with the computer. Keep at it; it will work eventually.

Raspberry Pi + Node.js + Socket.IO + Twitter Streaming API = Internet Bliss


I’ve had an idea floating around to build a tiny web app that does nothing but display new Tweets as they roll in from my Twitter user stream. I envisioned displaying the resulting feed on a spare laptop or an old smartphone 24/7. Node.js felt like it might be a good solution to make that happen. After reading through the Node docs and a quick search on Stack Overflow, I found a great example of making this project happen with both Node.js and Socket.IO.

Here’s a look at my version of this app which also includes a bit of CSS to make things look better in the browser.

Node.js Twitter stream

Check out the end result running on my Raspberry Pi.

I won’t go into installing Node.js, as it’s pretty straightforward from their site. You will need to install a few extra packages, but that’s easy enough:

  npm install express socket.io twit

The application code is shown below, but it’s also hosted on GitHub for easy download. We’ll start with the server (server.js), where the Twitter stream is consumed and processed. In this version I’m using the Twitter Firehose and limiting the Tweets that come back based on a keyword. In this case, any Tweet that contains the word ‘Nashville’ will be displayed in the browser.

server.js
var express = require('express')
  , app = express()
  , http = require('http')
  , server = http.createServer(app)
  , Twit = require('twit')
  , io = require('socket.io').listen(server);
server.listen(4040);

// routing
// Tell Node to load node-twitter-stream.html when the browser requests /
app.get('/', function (req, res) {
  res.sendFile(__dirname + '/node-twitter-stream.html');
});

// Tell Node to serve the CSS file when requested
app.get('/node-twitter-stream.css', function (req, res) {
  res.sendFile(__dirname + '/node-twitter-stream.css');
});

// When processing the Twitter firehose, only show Tweets with this keyword
var watchList = ['nashville'];

var T = new Twit({
consumer_key:             'your key here'
  , consumer_secret:      'your secret here'
  , access_token:         'your token here'
  , access_token_secret:  'your token here'
});

io.sockets.on('connection', function (socket) {
  var stream = T.stream('statuses/filter', { track: watchList })
  //var stream = T.stream('statuses/sample') // Firehose (sampling of all Tweets)
  //var stream = T.stream('user') // Your user stream

  // When a Tweet is received:
  stream.on('tweet', function (tweet) {
    // Make any link in the Tweet clickable
    var turl = tweet.text.match( /(http|https|ftp):\/\/[^\s]*/i )
    if ( turl != null ) {
      turl = tweet.text.replace( turl[0], '<a href="'+turl[0]+'" target="new">'+turl[0]+'</a>' );
    } else {
      turl = tweet.text;
    }
    var mediaUrl;
    // Does the Tweet have an image attached?
    if ( tweet.entities['media'] ) {
      if ( tweet.entities['media'][0].type == "photo" ) {
        mediaUrl = tweet.entities['media'][0].media_url;
      } else {
        mediaUrl = null;
      }
    }
    // Send the Tweet to the browser
    io.sockets.emit('stream',turl, tweet.user.screen_name, tweet.user.profile_image_url, mediaUrl);
  });
});

The HTML file is static except for a small piece of jQuery that fades the old Tweet out and fades the new Tweet in. If a Tweet has a photo attached to it, the photo is added as a background image (node-twitter-stream.html).

node-twitter-stream.html
<!DOCTYPE html>
<html>
  <head>
    <title>TwitterNode</title>
    <meta name="apple-mobile-web-app-capable" content="yes">
    <meta name="apple-mobile-web-app-status-bar-style" content="black">
    <link href='http://fonts.googleapis.com/css?family=Vollkorn' rel='stylesheet' type='text/css'>
    <link rel="stylesheet" href="/node-twitter-stream.css">
    <script src="/socket.io/socket.io.js"></script>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.js"></script>
    <script>
      var socket = io.connect('http://localhost:4040');
      socket.on('stream', function(tweet, user, avatar, media) {
        $( "#tweetd" ).fadeOut( function(){
          $( "#bg-image" ).css( "background-image","");
          $( "#tweetd" ).empty();
          $('#tweetd').prepend('<div class="image"></div><div><img src="'+avatar+'" width="48" height="48"><a href="http://twitter.com/'+user+'" target="_blank">@'+user+'</a> '+tweet+'</div>');
          if ( media ) {
            $( "#bg-image" ).css( "background-image","url("+media+")");
          }
          $( "#tweetd" ).fadeIn();
        })
      });
    </script>
  </head>
  <body>
    <div id="tweetd">Waiting for Tweets...</div>
    <div id="bg-image"></div>
  </body>
</html>

Here is the CSS (node-twitter-stream.css) that styles the application. There’s a small media query used here for better display on smartphones.

node-twitter-stream.css
body {
  background-color:#000;
  color:#ddd;
  font-family:'Vollkorn', serif;
  font-size:4em;
  letter-spacing:-0.04em;
  margin:0;
  padding:50px 75px 0 75px;
}

a {
  color:#ff5842;
}

img {
  float:left;
  position:absolute;
  right:0;
  bottom:0;
}

div {
  word-wrap:break-word;
}

#bg-image {
  background-size:cover;
  background-position:center center;
  opacity:.35;
  position:absolute;
  top:0;
  right:0;
  bottom:0;
  left:0;
  z-index:-10;
}

@media only screen and (device-width: 320px) {
  /* Style adjustments for 320px-wide devices */
  body { font-size:6em; }
}

@media only screen and (device-width: 320px) and (-webkit-min-device-pixel-ratio: 2) {
  /* Style adjustments for 320px-wide devices with retina displays */
  body { font-size:6em; }
}

With Node installed locally, you can now run this app and take a look in your browser.

Running the application
  node server.js

I wanted to access the app from anywhere and running from a Raspberry Pi over the internet seemed like a good option. There’s a great write-up on Matt’s Blog that explains how to get Node running as a service. In other words, your Node app can run behind the scenes even after rebooting your Raspberry Pi.

That’s it! You can now access your Node app from anywhere on any internet connected device. One item to watch out for, however, is the IP address you specify in the HTML file.

Mind your IP address
var socket = io.connect('http://localhost:4040');

For internet users, you’ll need to specify your external IP address. For clients on your local network, the Pi’s local IP address will be required. I ended up using a URL variable to determine which IP address to use.
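
Here’s a rough sketch of what that could look like in the HTML file. The parameter name and both addresses below are placeholders for illustration, not the exact values from my setup.

// Sketch: pick the Socket.IO host based on a URL variable, e.g.
// http://your-host/?local=1 for clients on the home network.
// The parameter name and both addresses are placeholders.
var useLocal = window.location.search.indexOf('local=1') !== -1;
var host = useLocal
  ? 'http://192.168.1.50:4040'    // the Pi's local address
  : 'http://203.0.113.10:4040';   // your external address
var socket = io.connect(host);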

Building an RSS Reader With Meteor


Meteor is a nifty JavaScript-based platform that features live page updates as data is changed across browsers and users. With the demise of Google Reader, I thought an RSS reader might be a good application to put Meteor through its paces. It turns out Meteor does a fantastic job and is a lot of fun to work with.

I named my RSS reader Ocular. (Feel free to kick the tires.) With my design background and reading habits in mind, Ocular behaves differently from the RSS readers most of us are familiar with. RSS is an amazing tool and a fantastic way to deliver information. The only downside is the lack of visual design accompanying most feeds. Ocular solves this by displaying the actual article associated with each feed item.

Ocular RSS Reader

By showing the actual article, the content’s original design is preserved, allowing the user to read content as the author intended. Added bonuses include the ability to see and interact with comments, and if the site is generating revenue through advertising, those ads will also be visible to the user.

The other major difference between Ocular and traditional RSS readers is the way Ocular handles new feed content. I’m a very casual reader when it comes to RSS. I don’t read every single feed item that is pushed out by major content publishers like Engadget or the NY Times. With that in mind, Ocular only checks for feed updates when you’re running the app. In my case, I keep Ocular open in a separate tab during the day while I’m working. At night, Ocular is closed and not pulling in updates. Turns out, this works out great because I’m still not missing any content. Sites that post less frequently still get picked up because Ocular is pulling in feed items based on the date of the last item that was pulled in. If one of my favorite sites publishes a few times overnight, those posts will be pulled in in the morning. On the opposite end of the spectrum, if a site like The Verge publishes 30 items overnight, it’s possible I might miss a few of the older feed items when I fire up Ocular in the morning. But, I’m not concerned about that because I don’t want to read that much content to begin with.

Aside from those differences, Ocular has features you would expect in an RSS reader: Feed lists, unread counts, the ability to mark favorites, etc.

Ocular RSS feed list

Meteor technical details
Here are some methods specific to Meteor that I found interesting while building this project. I’m somewhat of a novice when it comes to JavaScript, so please feel free to suggest improvements.

Pulling in RSS feeds
Probably my favorite piece of functionality in this app is that all the feed updates happen in the browser using the Google Feed API.

Here’s a shortened example:


$.ajax({
	type: "GET",
	url: "http://ajax.googleapis.com/ajax/services/feed/load?v=1.0&num=10&q="+url,
	dataType: "jsonp",
	success: function( data ) {
		// If article published date is newer than feed's last published date, add the article to the DB.
		if ( Date.parse( articles[j]['publishedDate'] ) > lastPublished ) {
			Meteor.call( 'addArticle', {
			  feedId: feedId,
			  title: articles[j]['title'],
			  url: articles[j]['link'],
			  publishedDate: articles[j]['publishedDate']
			}, function( error, result ) {
			  Meteor.call( 'updateReadCount', feedId, 1);
			});
		}
	}
});

If RSS articles are newer than the last saved RSS articles, we tell Meteor to save that article


Meteor.call( 'addArticle' );

and update the unread count


Meteor.call( 'updateReadCount', feedId, 1);

In our model file, those calls look like this:


Meteor.methods({
	addArticle: function( options ) {
    options = options || {};
    pubDate = Date.parse( options.publishedDate );
    return Articles.insert({
        owner: Meteor.userId(),
        feedId: options.feedId,
        title: options.title,
        url: options.url,
        publishedDate: pubDate,
        read: false,
        favorite: false,
        share: false
    });
  },
  updateReadCount: function( feedId, num ) {
    return Feeds.update( feedId, { 
      $inc: { unreadCount: num } 
    });
  }
});

The beauty of Meteor is that, as soon as those DB entries are updated, the template updates the list of articles and the unread count, all without a page refresh. It’s really exciting when you see that happen for the first time.
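
To give a sense of how little code that reactivity takes, here’s a stripped-down sketch of the kind of template helper involved. The template and field names are simplified for illustration and aren’t Ocular’s exact code.

// client.js -- stripped-down example. Articles.find() returns a reactive
// cursor, so the article list in the template re-renders automatically
// whenever addArticle or updateReadCount changes the underlying data.
Template.articleList.helpers({
  articles: function () {
    return Articles.find(
      { owner: Meteor.userId(), read: false },
      { sort: { publishedDate: -1 } }
    );
  }
});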

Displaying articles
This is a simple process, but something I had never done before. An iframe houses the article: when a user clicks on the feed item they’d like to read, the iframe is populated with the source URL from the feed article and the article is marked as read with a markRead method.


// client.js
Template.articleList.events({
    'click .article': function( event ) {
		Meteor.call( 'markRead', this._id );
	}
});

// model.js
markRead: function( articleId ) {
	return Articles.update( articleId, { 
  		$set: { read: true } 
	});
}
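
For completeness, here’s roughly what the iframe side of that click looks like when sketched out. The "reader" id is a placeholder, and the url field comes from the article document saved by addArticle; this isn’t Ocular’s exact markup.

// client.js -- sketch: load the clicked article into the iframe and mark
// it read. Assumes the iframe has an id of "reader" (placeholder) and that
// the data context is the article document, which carries a url field.
Template.articleList.events({
  'click .article': function( event ) {
    $( '#reader' ).attr( 'src', this.url );   // show the original article
    Meteor.call( 'markRead', this._id );      // flag it as read
  }
});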

User accounts
A nice add-on tool is the accounts-ui package that makes dealing with user accounts quick and easy. In the template, adding the login/account UI is as simple as adding a single line:


<div class="header">
	{{loginButtons}}
</div>

Then in the client file we can do things like check to see if the user is logged in and do actions based on that state:


// If user is logged in, show a div.
if ( Meteor.user() ) {
	$( "div" ).show();
}

Pulling in data based on the logged in user looks like this in the client file:


var feeds = Feeds.find({$or: [{owner: Meteor.userId()}]}, {sort: {"title": 1}}).fetch();

That pulls in a list of RSS feeds associated with that user and sorts them alphabetically by title.

Large data
The one drawback facing the current iteration of Ocular is sluggishness when updating large sets of data. If I have an unread count pushing 1,000+ items and I mark all items as read, Meteor takes several seconds to make those changes to each record and apply a “read” class to each HTML element in the template. I don’t know if the hang-up is on the DB side or the template markup side. I’m sure there’s a better way to optimize my approach, but that’s something I’m still learning.
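
One idea I’d like to try is pushing the “mark all as read” work into a single server-side method that uses Mongo’s multi flag, rather than updating each article from the client one at a time. This is just a sketch of that idea; I haven’t benchmarked it, and the method name is only for illustration.

// model.js -- sketch: mark every unread article in a feed as read with one
// update instead of one update per article. {multi: true} only applies in
// server-side code, and this isn't something I've measured yet.
Meteor.methods({
  markAllRead: function( feedId ) {
    var changed = Articles.update(
      { feedId: feedId, owner: this.userId, read: false },
      { $set: { read: true } },
      { multi: true }
    );
    // Zero out the unread counter in one shot as well.
    Feeds.update( feedId, { $set: { unreadCount: 0 } } );
    return changed;
  }
});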

A great experience
All-in-all Meteor has been a lot of fun. I wish I was doing more to take advantage of the power of live page updates. Something like a list of articles marked as favorites across all users in real time could be fun.

For someone somewhat new to the deeper depths of JavaScript, I found Meteor to be relatively smooth sailing. The documentation is well done, and it was rare that I encountered any major roadblocks. If you’re interested in Meteor, definitely give it a try. I’m already thinking about what I can build next.

Amateur Radio


Because I’m not already leading a completely geek-filled lifestyle between my day job and my time on Jawgrind discussing the merits of original Star Trek episodes, I decided to seal the deal and get my ham radio license.

The process of getting my license kind of happened by accident. My grandfather was a ham (K9PSA) and I always remember seeing his radio setup as a kid, but I never really put together what it was all about. Last year I stumbled upon a form of data transmission popular in the ham world that allows people to transmit text across great distances with a bare minimum of transmission power.

Screenshot of PSK31. Notice the “waterfall” of incoming signals. Transmissions are appearing from as far away as Cuba.

The idea of transmitting messages across states and countries sans internet is interesting, but the visual aspect of PSK31, which lets you literally see the data flowing through a block of spectrum, is what makes it intriguing. So I had to try it myself.

Over the last several months I’ve been using a little RadioShack shortwave receiver to view the transmissions, and now that I finally have my license, I hope to be transmitting soon.

Radio Shack DX-402 receiver

I passed both my Technician and General tests in March and received KK4HSX as my call sign. Now I’ve begun to slowly put a rig together. Recently I picked up an Alpha Delta antenna and mounted it to the highest point in the attic. With a length close to 40 feet, it fit perfectly stretching from one end of the attic to the other.

While saving up some dough for a decent radio to pair with that antenna, I’ve been playing around a bit on the local repeaters with a little Kenwood TH-K20A FM transceiver. I even enlisted my Dad’s help in building a larger (J-pole) antenna to plug into that little handheld. I’m still trying to find some time to properly test that thing out.

J-pole antenna

That just about sums things up for now. I’m sure many more amateur radio related posts will follow this one.

Republic Wireless Review


Now that I have a month’s worth of time with Republic Wireless under my belt, I thought this would be a good point to write up a review of those first 30 days.

Republic Wireless Logo

Republic Wireless is geared toward users who spend most of their time in an environment with persistent Wi-Fi. When in range of Wi-Fi, all phone calls are placed and received over Wi-Fi. When outside of Wi-Fi range, Sprint is the carrier all calls and data are routed through.

With most of a user’s calling minutes and data flowing over Wi-Fi, Republic Wireless can charge a low monthly fee of just $20 for unlimited everything.

Android OS

With access to Wi-Fi at home and work, this works out well. Even prior to Republic Wireless, my data usage over the cell network never made it over half a GB per month. So far so good! Let’s get into some detail.

First Impressions
While this is mostly UPS’s fault, my box arrived mangled and open. Luckily nothing had fallen out of the box in transit. (As far as I know.) I was looking forward to the Republic Wireless stickers that other customers had received, but was disappointed not to find any in the box. Maybe they fell out along the way.

Luckily the phone itself was unscathed. I was able to open everything up, charge the phone, and make calls right away.

UPS did a number on this package.

The phone: Motorola DEFY
The DEFY is by no means a high-end Android device, but compared to the iPhone 3G I was using previously, the DEFY is lightning fast. Switching between apps produces no lag. Waiting (sometimes minutes) for the keyboard to respond after a keypress on the iPhone is a thing of the past.

The Motorola DEFY.

Coming from the iPhone, the DEFY feels cheap and fragile, even though Motorola touts it as a rugged device that is water resistant and able to absorb shocks.

The best hardware feature of the DEFY compared to the 3G is the high-res screen. Even switching back and forth between the DEFY and the iPad 2, the iPad begins to look low-res.

Notice the smoothness of the edge of circular elements like the @ symbol.

GPS is more erratic on the DEFY than on the iPhone. Some of this could be attributed to Waze’s software, but when sitting at a traffic light the GPS software has a difficult time recognizing that the device is stationary. The position icon will move back and forth and the map will spin around, thinking the device’s direction has changed. I never saw this behavior on the iPhone.

Android UX
This being my first Android device, I actually expected a much worse experience after using iOS. There’s no question Android isn’t as polished as iOS, but it’s by no means unusable. The biggest fault in Android is the lack of consistency when interacting with similar UI elements across apps. For example, when using a text field, selected text is the usual indicator that the text will go away when the user starts typing. This appears in Android, but only half the time. In other instances, the text in a field will not be selected yet will still disappear when the user begins to type. That makes me question whether the text I’m about to modify is all going to disappear.

This text is clearly selected.

There are tons of little things like that throughout the OS. When the screen is off, iOS will light the screen and display a notification when receiving a text message. Android does not do this, so it’s difficult to know when notifications are arriving if you have the phone on silent.

Pinch and zoom gestures aren’t as smooth as iOS. Icons aren’t very well designed and often use unnecessary animation. The icon indicating that GPS is in use is an animated crosshair. There’s no reason to create that extra distraction for the user.

The row of hardware buttons at the bottom of the screen is silly. That should all be in the software UI.

Individually, these are all small annoyances, but they can add up to a less than pleasant experience.

Android does offer a few nice touches, including the ability to have on-off switches for Wi-Fi, GPS, etc. on the home screen. They’re not there by default, which is great for normal users, but it’s nice to have the option for power users. I do like Google’s voice commands; I find myself using them quite a bit for alarms, weather conditions, and measurement conversions.

Most of these complaints can be attributed to the DEFY running Android version 2.3. Next month that will be a two-year-old operating system. In the tech world, that’s a lifetime. Unfortunately, given the Google/handset manufacturer track record, I’ll be amazed if I see a newer version of Android show up on this phone.

Wi-Fi Calling
I think this is where Republic Wireless will really shine once the service moves out of beta. Call quality over Wi-Fi on a connection without heavy network traffic is superb. Audio has much more clarity than a traditional call placed over a cellular network.

On the shared public Wi-Fi at work (which is very user heavy) I do get reports of echoing, or artifacts similar to those heard on services like Skype or Vonage when under similar network conditions. On the receiving end, I’ve never heard audio drop or degrade.

I had to make some tweaks to my router at home to hear both ends of a Wi-Fi conversation, but once that was done I haven’t had any issues at the house. From what I’ve read on the Republic Wireless forums, the next over-the-air phone update will address some of the router issues that require settings to be changed by the end user.

I’ve only had one person comment on call quality, and that was only while I was standing outside the building, where Wi-Fi coverage gets sketchy, combined with the busy network traffic. Not bad for beta.

Cellular Service
Cellular service from Sprint is what you would expect from any carrier: reliable calls and data based on their coverage map. Data doesn’t appear to be limited or throttled for Republic Wireless users. I recently spent two long weekends completely off Wi-Fi and encountered zero issues.

SMS Messaging
A couple of issues to keep in mind here. Text messaging is only mostly supported, and MMS messaging is not supported at all. Basic text messaging works without issue, but a lot of services like GroupMe and WoW don’t support sending messages to VoIP-based services like Republic Wireless. It does vary from service to service; Google Calendar alerts send just fine. Keep that in mind if you rely heavily on those types of services.

Go for it.
All in all, for the price, the service is completely worth it. If you’re not a heavy phone and data user, Republic Wireless takes care of the basics. Calls are clear and data is abundant. If you don’t consider yourself technically inclined, wait until they get all the kinks worked out and sign up once they’re out of beta. Adjusting router settings isn’t something the average user should have to deal with.

I’ll be sticking with the service and look forward to what Republic Wireless has in store in the near future.

In Summary

The Good
  • Great price
  • Unlimited calling and data
The Bad
  • 2-year-old Android OS
  • Fiddling with router settings required

A Quick How-to With the aprs.fi API


APRS lets users share information (GPS tracks, WX info, etc.) both over the internet and over the air via amateur radio. (See Wikipedia for more about APRS.)

aprs.fi is the go-to site to see current APRS activity in your (or any) area. They also have an API that lets users tie into all this great data.

In the example below I’ve written a small PHP script that demonstrates how to pull data from the API and display that data in your terminal window.


<?php

ini_set( "user_agent", "Midnight Cheese (+http://midnightcheese.com/)" );

echo "\n\nFetching APRS data...\n\n";

function display_APRS() {
	$json_url = "http://api.aprs.fi/api/get?apikey=0000&name=KBNA,KF4KFQ,AG4FW,WR1Q&what=wx&format=json";
	$json = file_get_contents( $json_url, 0, null, null );
	$json_output = json_decode( $json, true);
	$station_array = $json_output[ 'entries' ];
	foreach ( $station_array as $station ) {
		$name = $station[ 'name' ];
		$temp = $station[ 'temp' ];
		$temp = ( ( 9 / 5 ) * $temp ) + 32; // Convert celsius to fahrenheit.
		echo "Temperature is ".$temp."°F at ".$name."\n";
	}
	echo "\n\n";
}

display_APRS();

?>

In this case we’re requesting weather information posted by a handful of different operators. The API returns the data as JSON, which is then parsed and displayed. The final output is shown below.

Fetching APRS data...

Temperature is 82.04°F at KBNA
Temperature is 84.92°F at KF4KFQ
Temperature is 78.08°F at AG4FW
Temperature is 82.94°F at WR1Q

Objects on the aprs.fi map.
Lots of APRS objects displayed on the aprs.fi map.

Old Stones River Road


If you look at the Percy Priest Lake area on Google Maps you can often see old roads that lead straight into the water. Recently I noticed such a road parallel to Stones River Road here in La Vergne, and this past fall Merredith and I decided to take a walk to see what we could find.

Old Stones River Road on Google Maps
Old Stones River Road on Google Maps

It turns out there was quite a bit to see of Old Stones River Road. From small drainage bridges to old fencing to the actual roadbed, it looks as though the road was left as-is when Percy Priest Lake was built in the 1960s. The following photos were taken just south of the Hurricane Creek boat ramp area.

Old Stones River Road
Old Stones River Road

One of the drainage bridges along Old Stones River Road
One of the drainage bridges along Old Stones River Road

Old Stones River Road showing fencing and drainage bridge
Old Stones River Road showing fencing and drainage bridge

View of Percy Priest Lake from Old Stones River Road
View of Percy Priest Lake from Old Stones River Road

Switching From WordPress to Octopress


Today marks yet another milestone for Midnight Cheese: a successful migration from WordPress to Octopress. With web trends moving toward fast, light, and responsive, my WordPress setup was becoming slow, bloated, and stagnant. WordPress functions well, but for a single user it was just too large a feature set, and the slow page-load times really made this obvious.

Octopress is the complete opposite of WordPress. Octopress is all static. No scripting. No DB.

Some History
The current design is the default Octopress theme, which means a redesign is coming soon. Looking back, I realized 2007 was the last time I applied a design update, which is a bit embarrassing.

Octopress is the fourth blogging CMS that the site has operated under. Blogger started everything off very early on, followed by Greymatter (also based on static files) in 2001, WordPress in 2004, and now Octopress.

Midnight Cheese in 2002
Midnight Cheese in 2002

Issues with Disqus
The WordPress import instructions on the Octopress site worked well. The most time-consuming part for me was getting Disqus properly set up. It turns out you have to enable comments on each post:


  comments: true

This meant a lot of text manipulation across some 470 posts. I used a little Perl to run through the _posts folder and change what was needed in each file; basically a find-and-replace action:


  perl -pi -e 's/type: post/type: post\ncomments: true/g' *.html

In addition, to get the comment count to appear on the main index view, I had to implement this Disqus fix from Benjie Gillam.

Image Files and Redirects
With my repo stored on GitHub, I decided I didn’t want all my images and various media files sitting on GitHub’s servers, or spread across multiple repos on my various computers. Most posts have quite a few images. To solve this I created an assets subdomain and moved everything over, keeping all those large files out of the code base.

Many of my images rank well in Google image search, which drives a good bit of traffic to the site. That meant 301 redirects for all my images. And again, I used that Perl one-liner approach to run through all the posts and modify the image URLs.

To the Future
I love how light and responsive the site has become with the static setup. Next on the list is a fresh design and custom asides. Flickr will probably be at the top of the aside list.