outten.net - random thoughts

Switched to Hugo

Several years ago, I migrated to my own blog application written in Ruby and CouchDB. Five years later, I'm switching to a static site generator called Hugo. Hugo is written in Golang and has some very nice features like templates, themes, taxonomies and more.

There are many good generators out there, but there were a couple of things I liked about Hugo.

  • Since it is written in Golang, there is a single static binary available, so there is no need to download dependencies.
  • Automatic reloading of the page via Hugo's "watch" mode.
  • It generates pages very quickly, and there's some nice activity on the mailing list.

We’ll see how it goes, but hopefully, I end up writing more posts.

BeagleBone Setup

For Christmas (or just after), I got a BeagleBone to try out some of the many input options that board has. I have done programming from the web and server perspective (and a little GUI work), but I have not tried programming something that interacts with the physical environment. Initially, I just planned to get up to speed with electrical circuits and gain a better understanding of some of the material from college (Electrical Engineering I and II). Basically, I'll make some LEDs light up based on a program I write. I will probably start with Node.js since Bonescript is a pretty nice library.

Initial Setup

By default, the BeagleBone comes with the Angstrom distribution installed on the MicroSD. I wanted to get some applications installed on it that I was familiar with on other Linux distributions. This includes:

  • mg – micro emacs
  • nginx – web server
  • tmux – terminal multiplexer
  • mosh – mobile shell for roaming and reconnecting

For the most part, these all installed fairly easily with the standard "configure, make and make install". There was one caveat to that.

Installing mosh

mosh was the one problem package. mosh depends on protobuf, and in order for mosh's configure script to find protobuf, I needed to set PKG_CONFIG_PATH when configuring mosh (found on this thread).

PKG_CONFIG_PATH=/usr/local/lib/pkgconfig ./configure

After that, mosh compiled without a problem.

Background Processes

After getting tmux installed, I thought I would give it a try. I created a session and then detached the window. I then re-attached without a problem. Things seemed to be working fine. I then detached and closed my ssh session. When I tried reconnecting and re-attaching my tmux session (tmux attach), it did not work. I hunted around and found other discussions on this, but none of the recommendations worked for me. After much searching and looking around, I ran across this discussion.

The Angstrom distribution uses the systemd system and service manager. The configs for systemd services are in /lib/systemd/system/. Angstrom, being an embedded distribution, appears to favor lighter-weight libraries. For ssh, it uses dropbear, and the file we are interested in is:

/lib/systemd/system/dropbear@.service

In this file, we need to tell systemd not to kill processes when the ssh session exits. To do this, we need to edit (or add) the KillMode line. It should look like this:

KillMode=process

The value "process" tells systemd to kill only the main process. In this case, it will kill only the dropbear process and leave other processes like tmux running. After making this change, I rebooted to make sure it took effect. I was then able to start a tmux session, detach, end my ssh session, reconnect via ssh and re-attach to the tmux session.
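For reference, the change amounts to an excerpt like this in the unit file (everything else in the file stays as shipped; only the KillMode line is added or edited):

```ini
# /lib/systemd/system/dropbear@.service (excerpt)
[Service]
# ...existing ExecStart and other settings unchanged...
KillMode=process
```

Instead of a full reboot, running systemctl daemon-reload and then reconnecting should also pick up the change, since each new ssh connection spawns a fresh dropbear@ instance.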

Along the same lines, mosh requires a background process to keep running after the ssh connection that started it ends. Needless to say, mosh wasn't working originally, but after this change it works like a charm. Now that I have my BeagleBone set up, I'm ready to start figuring out my first simple circuit to build.

First experiences with "async" library in Node.js

I have been hacking on some JavaScript code recently in particular with Node.js. With the hype around Node.js, I wanted to see what it was like to develop in JavaScript on the server. With Node.js’s heavy use of event-driven programming, I can see how there is real power there. It reminds me of some of the limited use of actors I have worked on or tinkered with.

This article might be more of a reference for me, but I did think it was an example of clever use of scopes in JavaScript. The problem I was trying to solve was running an SQL query to see if a row already existed in a table based on the key value and if it didn’t exist, go ahead and insert the row. I know I could have done it with a few callbacks or nesting functions, but I went in search of something better.

I had found several references to the async library. In particular, I thought the waterfall function would be a good fit with the exception of one thing. It wasn’t immediately obvious how to pass the key parameter to the first function which checked to see if the key was already used. I figured I could put all of the functions within a function and then reference some variables in the scope of the surrounding function. It would turn out something like:

  addEntry: (value, callback) ->
    key = value.key

    # function to find if the key exists; fat arrows keep @db/@log
    # bound to the surrounding object
    doesKeyExists = (callback) =>
      sql = """
        select count(*) as cnt
        from entries
        where key = ?
        """
      @db.get sql, key, (err, row) ->
        callback(null, row)

    # function to insert the row if it doesn't already exist
    addEntry = (exists, callback) =>
      @log.debug "_addEntry exists: #{util.inspect(exists)}"
      params = {$key: key, $value: JSON.stringify(value)}
      if exists.cnt > 0
        callback(null, false)
      else
        sql = """
          INSERT INTO entries
          (created_at, key, value)
          values (date('now'), $key, $value)
          """
        @db.run sql, params
        callback(null, true)

    # do methods in order and pass the callback values
    async.waterfall [ doesKeyExists, addEntry ], (err, result) =>
      if err
        @log.error "Error: #{err}"
        callback(err)
      else
        callback(result)

I didn’t actually try this, but I was thinking of something similar. Unfortunately, testing the inner functions is difficult because they rely on the context of the surrounding function. I was in search of a better solution. In searching, I found this answer on Stack Overflow. It really related to what I was trying to do. I wanted to pass a parameter to the functions that got passed to the async.waterfall tasks, but I didn’t want to have to embed them. Here is an excerpt of what I came up with:

  addEntry: (value, callback) ->
    key = value.key
    async.waterfall [ @_doesKeyExists(key), @_addEntry(key, value) ], (err, result) =>
      if err
        @log.error "Error: #{err}"
        callback(err)
      else
        callback(result)

  _doesKeyExists: (key) ->
    (callback) =>
      sql = """
            select count(*) as cnt 
            from entries
            where key = ?
            """
      @db.get sql, key, (err, row) ->
        callback(null, row)

  _addEntry: (key, value) ->
    (exists, callback) =>
      @log.debug "_addEntry exists: #{util.inspect(exists)}"
      params = {$key: key, $value: JSON.stringify(value)}
      if exists.cnt > 0
        callback(null, false)
      else
        sql = """
              INSERT INTO entries
              (created_at, key, value)
              values (date('now'),$key, $value)
              """
        @db.run sql, params
        callback(null, true)

By calling a function with the parameters and returning a function with the signature expected by async.waterfall, the variables remain available because they are in scope. This reminded me of the information hiding examples in JavaScript: The Good Parts.

Let's break down the _doesKeyExists function:

_doesKeyExists: (key) ->
  (callback) => …

With some of CoffeeScript’s goodness (last expression returned and shorter function declaration), an anonymous function is returned with the callback signature. key is available within the anonymous function because it is in the outer scope.

By separating the _doesKeyExists and _addEntry functions, they can now be tested separately, but can also be used in the async.waterfall call:

async.waterfall [ @_doesKeyExists(key), @_addEntry(key, value) ], …

This seemed like a clever approach to use within Node.js, or in the browser, to pass parameters to a series of async calls. Again I’m relatively new to evented programming so there is probably a better way to do this, but I could still see this being useful in future hacking.
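The same closure technique works outside CoffeeScript too. Below is a rough, self-contained plain-JavaScript sketch: the `waterfall` helper is a minimal stand-in I wrote for the async library's function (so the snippet runs on its own), and the tasks use an in-memory Map rather than a real database, so all of these names are illustrative:

```javascript
// Minimal stand-in for async.waterfall: run tasks in order,
// passing each task's results on to the next one.
function waterfall(tasks, done) {
  function next(err, ...args) {
    if (err) return done(err);
    const task = tasks.shift();
    if (!task) return done(null, ...args);
    task(...args, next);
  }
  next(null);
}

// Calling these with their parameters returns a function with the
// signature waterfall expects; `key` and `store` stay reachable
// through the closure, no embedding required.
function doesKeyExist(key, store) {
  return (callback) => callback(null, store.has(key));
}

function addEntry(key, value, store) {
  return (exists, callback) => {
    if (exists) return callback(null, false); // already present
    store.set(key, value);
    callback(null, true); // inserted
  };
}

const store = new Map();
waterfall([doesKeyExist('a', store), addEntry('a', 1, store)], (err, added) => {
  console.log(added); // logs true: 'a' was absent, so it was inserted
});
```

Running the same pair of tasks again with the same key would report false, since the first task's closure now sees the key already in the Map.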

My 1st Experiences with R

I decided to take a look at R this weekend between our family events. I had looked at R before when I ran across the R Tutorial. I bookmarked it and decided I would come back later. The other day at work a professor performing big data analysis started a conversation about R and offered to create some examples. He recommended working through the examples and tweaking those as a way to learn R.

Upon further consideration, I decided to go back to R Tutorial and work through the basics of input and data types. I felt like I needed a base before trying some of the examples. This approach seemed to work well, at least for me. I worked through the first 2-3 sections of the tutorial. When I got to the plotting section, things got interesting. Creating plots with the built-in functions seemed very straightforward and powerful. This led me on a quest to find more plotting libraries with subjectively more visually appealing graphs (the default plots aren't too shabby).

googleVis

I had used Google's Charting Tools before and was curious to see if there was an option to use them from R. It turns out there is an excellent library called googleVis. This library creates the HTML, JavaScript and CSS required to create graphs from R. The resulting graphs are visually appealing and interactive. The one downside is that, since they rely on the Google Charting Tools, they require an Internet connection to pull down the required JavaScript from Google. I will probably use googleVis for some charts in the future, but it wasn't ideal for my current purpose.

ggplot2

The next graphing library I looked at was ggplot2. From the website examples, this library appeared to be exactly what I was looking for, and each function is well documented in the reference manual. When I started trying to use the library, I had already pulled in my dataset from a CSV file. My initial problem was figuring out how the data was passed to the ggplot function and how this related to the qplot function. The "grammar of graphics" was getting the best of me. After searching the web, I found the R Cookbook, which had more examples of scatter plots, not to mention a lot of other good R information. These examples provided the missing link for me: how to supply the data to the graphing functions to get the plots to work. With the combination of ggplot2 and the R Cookbook, I was able to create graphs that provided some additional insight into the data.

Sample density graph

Other Thoughts

Some other things I wanted to note:

  • Installing packages available on CRAN is extremely easy, and they just worked.
  • After starting with the binary for R, I found RStudio. It looks like it is in early development, but it quickly became my default environment. With an editor, workspace and console, it was hard to find something better.
  • R has a "batch" processing mode (plus Rscript) which looks interesting for processing data outside the environment.

These were just some of my early experiences with R. Overall, I really enjoyed using R, and the ideas started flowing on how I might use it: everything from analyzing spending at home to analyzing data at work. Next, I hope to look at using R with ggplot2 to analyze the results from Apache Bench. If the results turn out interesting, I hope to find time to share them.

General Virtual Box SSH Setup

Running a virtual machine is extremely handy for development or for trying out different configurations. VirtualBox is handy virtualization software, especially since it is free. My goal was to set up a virtual machine to familiarize myself with a few different configuration management tools. I could have tried Vagrant, but since we use CentOS for most of our servers and Vagrant defaults to Debian, I went ahead and installed CentOS myself from a boot.iso.

I wanted to start my CentOS virtual machine in VirtualBox and then minimize it. I planned on using ssh to connect from my host machine's terminal. Since my guest machine was running in NAT mode, I had to tell VirtualBox to forward a port from my host machine to my guest machine. I decided to forward port 2222 on my host machine to port 22 on my guest machine for ssh. From the Terminal, I ran the following commands:

VBoxManage setextradata "Centos5" "VBoxInternal/Devices/e1000/0/LUN#0/Config/ssh/HostPort" 2222
VBoxManage setextradata "Centos5" "VBoxInternal/Devices/e1000/0/LUN#0/Config/ssh/GuestPort" 22
VBoxManage setextradata "Centos5" "VBoxInternal/Devices/e1000/0/LUN#0/Config/ssh/Protocol" TCP

Your VM name ("Centos5") could be different, as could the "VBoxInternal…" path. Once this was done, I booted my VM and was able to ssh to the guest:

ssh -p 2222 127.0.0.1 
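To avoid remembering the port, a host alias in ~/.ssh/config on the host machine works as well (the alias name "centos5-vm" here is just my own choice):

```
Host centos5-vm
    HostName 127.0.0.1
    Port 2222
```

With that in place, a plain `ssh centos5-vm` connects to the guest.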

From there, I was able to ssh to my local VM and try out some different configuration management tools.

RPCFN #4 in Erlang

A few weeks ago I wanted to start learning Erlang. A co-worker pointed out the Ruby Programming Challenge for Newbies that they were completing in Ruby. I decided to try the RPCFN #4, but write it in Erlang. This probably isn’t the most concise or best implementation, but it was a good exercise to encourage me to look at Erlang.

%% This was inspired from the ruby programming challenge for newbies.
%%   http://rubylearning.com/blog/2009/11/26/rpcfn-rubyfun-4/

-module(polynomials).
%% -compile(export_all).
-export([poly_epr/1]).
-import(string, [concat/2]).
-import(lists, [append/1]).

%% include the test module
-include_lib("eunit/include/eunit.hrl").

%% Create a polynomial expression from an array of numbers
poly_epr(List) when is_list(List), length(List) >= 2 ->
    P = gen_epr([], List),
    case R = join_epr(P) of
        "" -> "0";
        _  -> R
    end;
poly_epr(_) ->
    {error, "Need at least 2 coefficients"}.

%% generate the polynomial expression
gen_epr(Poly, []) ->
    case Poly of
        [] -> "0";
        _  -> Poly
    end;
gen_epr(Poly, [H|T]) ->
    Poly ++ gen_epr([term(H, length(T))], T).

%% join the expression terms together
join_epr([]) -> "0";
join_epr([H|T]) ->
    H ++ append([check_neg(X) || X <- T]).

%% add the appropriate sign in front of an expression term
check_neg([]) -> "";
check_neg(Val = "-" ++ _T) -> Val;
check_neg(Val) -> concat("+", Val).

%% create an expression term
term(1, Expo) -> expo(Expo);
term(-1, Expo) -> "-" ++ expo(Expo);
term(0, _Expo) -> "";
term(Val, 0) -> integer_to_list(Val);
term(Val, Expo) when is_number(Val), is_number(Expo) ->
    concat(integer_to_list(Val), expo(Expo));
term(_Val, _Expo) -> "".

%% create the exponent expression
expo(1) -> "x";
expo(E) -> concat("x^", integer_to_list(E)).

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% TESTS

%% term tests
term1_test() -> "x^2" = term(1,2).

term_negative_value_test() -> "-x^2" = term(-1,2).

term0_test() -> "" = term(0,5).

term_zero_exponent_test() -> "5" = term(5,0).

term_bad_values_test() -> "" = term("str","more").

%% poly tests from the rpcfn
poly_epr1_test() -> ?assert("3x^3+4x^2-3" =:= poly_epr([3,4,0,-3])).

poly_first_negative_test() -> ?assert("-3x^4-4x^3+x^2+6" =:= poly_epr([-3,-4,1,0,6])).

poly_simple_test() -> ?assert("x^2+2" =:= poly_epr([1,0,2])).

poly_first_minus_one_test() -> ?assert("-x^3-2x^2+3x" =:= poly_epr([-1,-2,3,0])).

poly_all_zero_test() -> ?assert("0" =:= poly_epr([0,0,0])).

poly_test_error_test() ->
    {error, Msg} = poly_epr([1]),
    ?assert("Need at least 2 coefficients" =:= Msg).

You need to have eunit set up in your code path. Then you can start the Erlang shell and run the tests. Really, they pass!

$ erl
Erlang R13B03 (erts-5.7.4) [source] [smp:2:2] [rq:2] [async-threads:0] [kernel-poll:false]

Eshell V5.7.4  (abort with ^G)
1> c(polynomials).
{ok,polynomials}
2> polynomials:test().
  All 11 tests passed.
ok
3>

Fast and Friendly Autotest for your Mac

Autotest, which is part of ZenTest, is a very handy testing tool. It runs your tests as changes are made to the code. When using it, I would accidentally leave it running and then notice something using up CPU cycles. It would turn out to be the autotest process, still scanning files for changes every so often. I would then stop autotest, only to have to start it up again when I returned to the project.

A while ago, I ran across autotest-fsevent for the Mac. It uses the Mac's FSEvents core service to be notified when files have changed. This really cut down on the wasted CPU cycles, especially when nothing has changed, and autotest is notified immediately when a file does change.

I would also recommend upgrading to the latest autotest-growl as well. I was using an older version and there have been improvements.

WhiteKnightTwo at Oshkosh

This year I made the trip to Oshkosh for the EAA AirVenture airshow. One of the coolest planes I got to see there (and there were many nice planes) was the WhiteKnightTwo. The last time I was at Oshkosh, I was able to see Scaled Composites' Boomerang, which was an aerodynamic wonder. Looking at the WK2, the two-fuselage approach reminded me of the Boomerang, but differed because of the symmetry of the plane. In looking for differences, the right fuselage had one extra exhaust on the upper outside in the back. I could not figure out what it might be used for, but it was a difference. It also appeared that there was more visible wiring in the left fuselage, presumably for flight instruments.

Overall, a fascinating aircraft, especially if it enables the start of commercial space flight. I can't wait to see it happen; the folks at Scaled have done some pretty remarkable things.

Task Break Down

Last week, my wife and I put down new mulch in the natural areas of our yard. As I was loading cart after cart of mulch to lug across the yard, I started thinking we might not be able to get all 8 yards of mulch spread that day. I really wanted to get this done so I didn’t have to worry about it over the weekend. Looking at the pile, it did not seem to be getting any smaller and it was approaching lunch time. A thought came to mind: how can I make this into a smaller task that I could accomplish in a few hours instead of looking at the entire pile over the period of a day? What if I worked to split the mulch in the middle into two separate piles (dividing and conquering)? That could make the task interesting (kids might like it) and give me a smaller goal to attempt to reach.

How many times do I find myself asking that question? How can I break task X down into something smaller so I can feel like I am accomplishing something? My understanding of goal setting comes from studying GTD, Agile development processes and a history of playing basketball. GTD and Agile development teach this as one of their core concepts (if I understand them correctly). The further along I got in basketball, the more we broke down plays or watched videos in smaller chunks to analyze how we could get better. Breaking issues down into smaller tasks seems to be fairly important and a skill that I frequently find myself using (as long as I don't over-plan).

Back to my pile of mulch. I was able to get it divided into two halves.

divide

From there, I proceeded to break it into smaller tasks. I focused on the smaller half first and then started breaking off the corners of the larger half that was left. It sure did help, and we were able to get all of the mulch spread by the end of the day (YAY!). Once again, the smaller goals, although they may not have changed how fast I got my part done, did help me focus on small units of work that I needed to finish.

Team Values

Having played sports (namely basketball) for a number of years, I was on a number of different teams and had a number of different coaches (two main coaches across high school and college). Looking back, it was very interesting to see the different styles of coaches and players. I was pretty lucky: for the most part, the majority of my experiences were on teams that understood the concept of "team play".

One thing I didn't completely realize was how important the coach's leadership was and the values it instilled. When players first joined the team, they wouldn't completely fit in, or at the very least they would struggle a little. With these core values, provided by the coach originally and promoted by the upperclassmen, the freshman (or new transfer) has a base to build from. As they grow as a player and a person (going from freshman to sophomore and so on), their playing skills develop, but they also continue to 'buy in' (or gel) to the team values. Of course, this happens at a different pace for each player.

I am starting to see that again in agile software development. As teams shift and grow, you go through this process over and over. I think it is important to have those underlying values, but the overall appearance of the team will reflect the current players. I am learning there is a delicate balance between emphasizing values, letting the team gel and using the strengths of the new players. If too much emphasis is placed on values, it suppresses the strengths of new players. If the values shift too much, you lose the history that brought you success in the past. The sweet spot is somewhere in the middle, and where it lies depends on the number of returning starters you have from the previous season (or project).