Using NodeJS to record microphone input to mp3 files on Ubuntu

Install lame mp3 encoder if you don’t have it.

sudo apt-get install lame

You should already have arecord, which records audio and sends it to stdout.

  1. Run the command alsamixer to see your audio inputs and tweak volumes
  2. Run the command arecord -f cd | lame - out.mp3 to record audio to an mp3 file called out.mp3 until you hit Ctrl+C

Now do that with NodeJS!

OK! This will record audio until you exit the script with Ctrl+C.

const spawn = require('child_process').spawn;

// prepare 2 child processes
const recordProcess = spawn('arecord', ['-f', 'cd']);
const encodeProcess = spawn('lame', ['-', 'out.mp3']);

// pipe them: raw audio from arecord goes straight into lame's stdin
recordProcess.stdout.pipe(encodeProcess.stdin);

// get debug info if you want
recordProcess.stdout.on('data', function (data) {
  console.log('Data: ' + data);
});
recordProcess.stderr.on('data', function (data) {
  console.log('Error: ' + data);
});
recordProcess.on('close', function (code) {
  console.log('arecord closed: ' + code);
});

// this seems like a good idea, but might not be needed
process.on('exit', (code) => {
  console.log(`About to exit with code: ${code}`);
});

Fixing SSL errors on Android Chrome with RapidSSL

Your ‘Order: xxxxxxx Complete’ email from RapidSSL includes links to a bunch of intermediate SSL certificates. Will you install the right one? I have seen installing the incorrect intermediate SSL certificate into the certificate chain cause Chrome on Android to declare a site insecure and block users from accessing it, while every other browser accepts it.

This tool helped me fix the issue: It pointed out that the wrong intermediate certificate was in the chain, and directed me to download a RapidSSL SHA256 CA - G3 certificate. I chained that with my server’s SSL certificate, and my Android issue went away.


rsync files from vagrant box to AWS server

I just transferred a directory full of audio files that I didn’t want to check into a project’s Git repository with this command:

 rsync -avz -e "ssh -i /home/vagrant/key.pem" ./source-directory ubuntu@<aws-server>:/destination-directory


  • key.pem is the same key file I usually use to SSH into the AWS server
  • Copy the key file from the Vagrant box’s host machine (my dev machine) into a non-shared directory in the Vagrant machine (/home/vagrant), and set its permissions to 0600; otherwise rsync complains that the key has the wrong permissions and the command fails.
  • The a flag means archive mode, which recurses into directories and preserves permissions, timestamps, and other file properties
  • The v flag means verbose
  • The z flag enables compression
  • The e flag specifies the remote shell command to use (here, ssh with the key file for the user I usually SSH as). This is the most important flag to understand for my use case

more rsync info and more rsync examples

NodeJS setup tips for DevOps

These are steps I took to help secure a NodeJS web application and keep it running reliably on Ubuntu 12.04 on an Amazon EC2 server.

Stop running NodeJS as root and use port forwarding

Because if someone manages to hack some aspect of your application, they could do a lot of damage to your server.

The downside of running a NodeJS process as a non-root user is that it can’t listen on port 80 or 443, which poses a problem for many web applications. The solution is port forwarding. Here’s an excellent write-up on port forwarding with iptables.
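As a sketch, the forwarding rule looks something like this, assuming the Node app listens on port 8080 (the port number is an assumption, not from the write-up):

```shell
# Redirect incoming HTTP traffic on port 80 to the Node app on port 8080
# (8080 is an assumed app port -- change it to whatever your app listens on)
sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080
```

The REDIRECT target rewrites the destination port before the packet reaches your app, so the non-root Node process never needs to touch a privileged port.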

But the next time the server is rebooted, your iptables config will be gone. The rules weren’t permanent. To save the configuration, check out Solution #1 in the iptables howto and this stackoverflow Q&A. They both describe the same method, but one page makes it look more complicated than the other.
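The gist of that method, as a sketch using Ubuntu 12.04’s default paths:

```shell
# Dump the current rules to a file...
sudo sh -c 'iptables-save > /etc/iptables.rules'

# ...then reload them at boot by adding this line to /etc/network/interfaces
# under your primary interface (path and mechanism are the Ubuntu 12.04 defaults):
#   pre-up iptables-restore < /etc/iptables.rules
```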

Reboot the server and view your iptables to check the settings are still applied.

Open File Limits

Linux puts limits on the number of files a user can have open at once. You can see the limit with the command ulimit -n. Linux also counts open network connections as open files. Realtime web applications open at least one connection per user (and keep it open) as users come to your site, leave it open in a browser tab, and then go somewhere else. On a default Ubuntu setup, ~1000 idling users with connections open may be enough to bring your NodeJS app down with this error:

Error: EMFILE, Too many open files
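To see where you stand before that happens:

```shell
# Show the open-file limits for the current shell
ulimit -n     # soft limit -- the one your process actually hits
ulimit -Hn    # hard limit -- the ceiling a non-root user can raise the soft limit to
```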

You can temporarily raise the open file limit for the currently logged-in Linux user with the command ulimit -n 5000, however this change will be wiped out as soon as the Linux shell user logs out. There’s a good blog post on updating ulimit numbers which outlines the steps to make the change permanent, even after a reboot.
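The permanent version of the change ends up in /etc/security/limits.conf; a sketch, assuming your app runs as the ubuntu user and 10000 is a number you’ve picked for your traffic:

```
# /etc/security/limits.conf  (the user name and the 10000 value are assumptions)
ubuntu  soft  nofile  10000
ubuntu  hard  nofile  10000
```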

I’ve gone live already! Is it too late?

If you can swap your active web application server(s) from one machine to another without losing data, it isn’t too late.

In my case, I had a NodeJS app running on an AWS EC2 server, and a database hosted somewhere else by MongoLab. I was able to:

  1. Make a copy of the EC2 server by creating an AMI and launching a new server from it
  2. Make the iptables and ulimit changes on the new server and reboot it to check the changes stuck
  3. Check the website was still accessible on the new server by accessing it via its Public DNS URL
  4. Point the domain to the new server by re-associating the Elastic IP with it
  5. Shut down the old server.
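If you have the AWS CLI installed, the Elastic IP re-association step can be sketched like this (the instance ID and IP are made-up placeholders; in a VPC you’d pass --allocation-id instead of --public-ip):

```shell
# Point the Elastic IP at the new instance
# (i-0123456789abcdef0 and 203.0.113.10 are placeholder values)
aws ec2 associate-address --instance-id i-0123456789abcdef0 --public-ip 203.0.113.10
```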

Installing OptiPNG 0.7.3 (or newer) on Ubuntu 12.04

Unfortunately sudo apt-get install optipng installs an outdated version of OptiPNG (0.6.x). To get the newest version, you’ll have to compile from source code. And that ain’t bad.

This tutorial assumes you have the build-essential package installed. If not, you can check out this Compiling: Easy How To article on Ubuntu to get set up and gain a better understanding of what is going on here.

  1. Download the source code with this command. This may work if SourceForge doesn’t change their URL structure.

If that URL doesn’t work, start searching for the new download link at the OptiPNG site.
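The download command was a wget along these lines; the exact URL is an assumption based on SourceForge’s usual project layout, so fall back to the OptiPNG site if it 404s:

```shell
# Assumed SourceForge download URL -- verify on the OptiPNG site if this fails
wget http://downloads.sourceforge.net/project/optipng/OptiPNG/optipng-0.7.3/optipng-0.7.3.tar.gz
```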

  2. Untar it
    tar xvf optipng-0.7.3.tar.gz
  3. Go into the extracted folder, compile, and install it
    cd optipng-0.7.3
    ./configure
    make
    sudo checkinstall

    This will compile OptiPNG and make it accessible from the command line.

  4. Try out OptiPNG! Run this command to see the version number:

    optipng -v

    The result should be something like this:

    OptiPNG version 0.7.4
    Copyright (C) 2001-2012 Cosmin Truta and the Contributing Authors.
    This program is open-source software. See LICENSE for more details.
    Portions of this software are based in part on the work of:
      Jean-loup Gailly and Mark Adler (zlib)
      Glenn Randers-Pehrson and the PNG Development Group (libpng)
      Miyasaka Masaru (BMP support)
      David Koblas (GIF support)
    Using libpng version 1.4.12 and zlib version 1.2.7-optipng

You are ready to optimize PNG files now!

EC2 + Ubuntu 12.04 + LAMP 2 Server Setup checklist

My steps:

Install stuff

sudo apt-get update
sudo apt-get upgrade
sudo tasksel install lamp-server
sudo apt-get install subversion phpmyadmin php5-curl
  • Add Include /etc/phpmyadmin/apache.conf somewhere in your Apache config files to enable it
  • Use a password generator to generate passwords
  • Uncomment store-plaintext-passwords = no in /home/ubuntu/.subversion/servers so svn stops bugging you about it
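The phpMyAdmin step above can be sketched as follows, assuming Ubuntu 12.04’s default Apache layout:

```shell
# Append the phpMyAdmin include to the main Apache config and reload
# (/etc/apache2/apache2.conf is the default config path on Ubuntu 12.04)
echo "Include /etc/phpmyadmin/apache.conf" | sudo tee -a /etc/apache2/apache2.conf
sudo service apache2 reload
```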

  • AWS – set up CloudWatch to watch Status and CPU usage

Other Tweaks

error while loading shared libraries

Just installed ffmpeg on Ubuntu 12 and tried running it only to see this error?

ffmpeg: error while loading shared libraries: cannot open shared object file: No such file or directory

ffmpeg is looking in the wrong spot for the library you installed. Try this bash command:

ldd `which ffmpeg`

And look for a line ending in => not found in the result to confirm the issue.

How to fix it

Using this additional command fixed my problem.

sudo ldconfig -v

This updates the lists of shared libraries cached in Ubuntu, and now ffmpeg can find that missing shared library.
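If ldconfig alone doesn’t fix it, the library may live in a directory the loader doesn’t scan; a sketch, assuming the library was installed under /usr/local/lib (that path is an assumption):

```shell
# Tell the loader about /usr/local/lib, rebuild the cache,
# then re-check ffmpeg's dependencies
echo "/usr/local/lib" | sudo tee /etc/ld.so.conf.d/local.conf
sudo ldconfig
ldd "$(which ffmpeg)" | grep "not found"   # no output means everything resolves
```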

Find old bash commands in Linux

Did you use a wicked one-liner last week, but forget what it was? Search your bash history with grep! If you remember part of the command, you may be able to dig it up.

I was messing around with php-cs-fixer last week, and could not remember what commands I tried out. Searching with

history | grep fixer

found me:

 1333  php-cs-fixer
 1337  php-cs-fixer fix src --dry
 1338  php-cs-fixer fix --dry-run src
 1339  php-cs-fixer fix --dry-run --level all  src
 1340  php-cs-fixer fix --dry-run --level PSR2  src
 1341  php-cs-fixer fix --dry-run --level psr2  src
 1342  php-cs-fixer fix --dry-run --level=psr2  src
 1343  php-cs-fixer fix --dry-run --level=all  src
 1344  php-cs-fixer fix --dry-run src
 1345  php-cs-fixer fix src
 1346  php-cs-fixer fix src | grep sdb
 1347  php-cs-fixer fix src
 1348  sudo php-fixer self-update
 1349  sudo php-cs-fixer self-update
 1350  php-cs-fixer fix src
 1352  php-cs-fixer fix src
 1356  php-cs-fixer fix src
 1357  php-cs-fixer fix src --level=PSR2
 1358  php-cs-fixer fix src --level=psr2
 1424  php-cs-fixer

List modified files with SVN in Linux

When the command svn status returns too much information, filter it with grep. This line finds only modified files and leaves out deleted, missing, and unversioned files.

svn status | grep ^M
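The same trick works for svn’s other status codes; grep’s ^ anchors the match to the first column, where svn prints the status letter:

```shell
svn status | grep '^M'   # modified files
svn status | grep '^?'   # unversioned files
svn status | grep '^!'   # missing files
svn status | grep '^D'   # deleted files
```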