Different ways to extend the functionality of the Django auth User model

I was working on an automated way to create a one-time-use login code that authenticates a user account. The issues I ran into were how to store the generated code alongside the user model and where to kick off the logic that generates and saves it. My search for a solution led me to the different ways of extending Django's auth User. I had not used post_save signals before and found them a great tool for linking functionality between models.


For all those command-line ssh users out there who have encountered WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED when they know they are connecting to the correct device or server, here is a way to prevent the error from appearing so you can get your work done quickly.

First, ssh has options to skip host-key checking and suppress this warning. You can use the following options when starting an ssh session.

ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no [user]@[ipAddress]

However, that is a bit cumbersome to type every time when we are all used to just typing

ssh [user]@[ipAddress]

So to ease the pain, you can add a new command via .bashrc to do this automatically. In Ubuntu, you can edit your /home/[user]/.bashrc and add

sshignore() {
    ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no "$@"
}

alias sshi=sshignore

After you save the changes to .bashrc, reload it by doing

source /home/[user]/.bashrc

Now you can just do the following when starting an ssh session and suppress the warning.

sshi [user]@[ipAddress]

NOTE: Obviously, ssh shows this warning by default for security reasons, and you should know why you want to circumvent it. Only use this in situations such as reconfiguring IP addresses on many devices over and over, where the warnings are just annoying.
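If the warning is caused by a single stale entry (for example, one device whose key legitimately changed), a more targeted fix than disabling checking is to delete just that host's key with ssh-keygen. A sketch, assuming OpenSSH is installed; the host name is a placeholder, and the script builds its own throwaway known_hosts so nothing real is touched:

```shell
# Work in a scratch dir so the real ~/.ssh/known_hosts is untouched.
dir=$(mktemp -d)

# Fabricate a host key and a known_hosts entry for a hypothetical device.
ssh-keygen -q -t ed25519 -N '' -f "$dir/hostkey"
printf 'device1.local %s\n' "$(cat "$dir/hostkey.pub")" > "$dir/known_hosts"

# Remove the (pretend-stale) entry: -R matches the host name,
# -f selects the file (omit -f to edit the default ~/.ssh/known_hosts).
ssh-keygen -R device1.local -f "$dir/known_hosts"
```

The next connection to that host will prompt you to accept its new key, while strict checking stays on for every other host.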

Webstorm with Dracula theme making side panels unreadable

Just a note that when I switched to the Dracula theme in my WebStorm IDE (version 11.0.1), all the content in the side panels became unreadable unless clicked. The simple fix is to close the application and restart it. Just a slight bug that I can live with.

USB security keys

So after setting up password storage on LastPass, I decided to find out how else to make my online login process more secure and less painful. That is when I came across a story about Facebook supporting USB security keys. I looked into what a YubiKey 4 can do and just ordered one. The device promises two-factor authentication without needing to receive a text or run a code generator on your phone. I have used those other methods for two-factor authentication on Gmail and GitHub, but they are error prone, as I am not very good at remembering six digits and typing them back on my laptop. They also slow down my login process, which of course I am not a fan of when trying to get things done fast.

There are also geeky features, which I always like. It can store key pairs, so I can use the YubiKey 4 to ssh without keeping a private key on my hard drive. That prevents someone from stealing my private key if my laptop gets compromised. I found a YubiKey-SSH how-to that explains it.

Anyway, more security for the new year!

Storing passwords

I’ve been searching for different ways to store all the precious passwords I’ve scattered throughout the web. I started by keeping all my passwords in a file on my computer, with the file permissions set so that only I could read and write it. But that felt like putting all my eggs in one basket that I alone protect. How much time do I really put into security each day, you might ask? As you would guess: not much.
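For reference, the permission lock-down described above is a one-liner; a sketch with a hypothetical file name:

```shell
# Store passwords in a file only the owner can read or write.
# (file name is hypothetical)
passfile=my-passwords.txt
touch "$passfile"
chmod 600 "$passfile"      # rw for owner, nothing for group/others
stat -c '%a' "$passfile"   # prints 600 on Linux
```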

So my next attempt at solving the password problem was to never write passwords down, but instead to keep an algorithm in my head that generates them. Every time I needed to log in to a website, I would run through the algorithm and derive the password. It took a few more brain cycles, but it worked. The downfall of that method is changing passwords when needed. Everything worked fine until some website got hacked and its user credentials were stolen; I would have to change my password there, which meant coming up with another algorithm. Then I had two algorithms to remember, plus knowing which algorithm went with which site. Over time the method got complicated and took up brain power that could have gone toward dreaming about what cool thing to buy on Amazon.

So fast forward to the present: I just set up an account at LastPass to store all of my passwords. It solves the problem of remembering passwords and running password-generating algorithms in my head, and they spend their days on security to keep my passwords safe. There is a Google Chrome extension that logs me in automatically on sites where I have accounts. Although I have not tried the feature yet, LastPass can also change the passwords on my accounts easily, so I can do that often. Changing passwords frequently is a good security practice for fending off breaches.

Now I can get back to applying my brain power towards buying stuff on Amazon.

Connect multiple Carambola2 devices together using Openwrt

I found a fun little network-enabled device that runs OpenWrt, called the Carambola2. The developer-kit version has two ethernet ports, and I wanted to connect several devices on the same subnet by daisy chaining eth0 and eth1 across them. The requirement is that any device can reach any other device on the network without using a centralized switch. The following lan section of the network configuration shows how I bridged eth0 and eth1. The configuration uses static IPs; I have not yet figured out whether DHCP works through the main router.

config interface 'lan'
    option type 'bridge'
    option proto 'static'
    option netmask ''
    option ipaddr ''
    option _orig_ifname 'eth0'
    option _orig_bridge 'true'
    option ifname 'eth0 eth1'
    option gateway ''
    option dns ''
    option broadcast ''

The netmask, ipaddr, gateway, dns, and broadcast values are left blank above; fill them in for your network, and make sure each device has a unique IP within the subnet range.
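For example, the next device in the chain would carry the same bridge stanza with its own address. The values below are hypothetical placeholders, not from my actual setup:

```
config interface 'lan'
    option type 'bridge'
    option proto 'static'
    option ifname 'eth0 eth1'
    option ipaddr '192.168.1.11'       # hypothetical; must be unique per device
    option netmask '255.255.255.0'     # hypothetical
    option gateway '192.168.1.1'       # hypothetical
```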

Setup your own deb repository

It is difficult to set up a Debian repository that is signed and can handle multiple versions of a single package. I went through many different packages and tutorials on setting up my own Debian repository, and it was a pain to find a method that works. My two requirements might be specific to my needs, but together they give the user the following two abilities.

Signing the repository means apt trusts the packages, so automated scripts can install or upgrade a package non-interactively:

sudo apt-get -y install <package>

Handling multiple versions of the same package lets the user install whichever version they need, for example:

sudo apt-get -y install <package>=<version>

Generate a gpg key

We need a gpg key to sign our packages and repository. So let’s generate one before setting up freight.

gpg --gen-key

Select option 4, RSA (sign only).

Fill out all the information it asks for.

Note: Be sure to add an email. It will be needed to configure freight.

Most likely there will not be enough entropy to generate the key, and gpg will sit waiting for more. A good way to generate entropy is to run stress: open another terminal and install and run it while the key generation is waiting.

sudo apt-get install stress

stress --hdd 8 --io 8

If you want to watch the available entropy being generated, open another terminal and run

watch cat /proc/sys/kernel/random/entropy_avail

Installing Freight

The package that I found to handle both my requirements was freight. The instructions are there, but I will document what I did just to be complete.

I installed via apt-get, so I needed to add the third-party source list before installing:

echo "deb http://packages.rcrowley.org $(lsb_release -sc) main" | sudo tee /etc/apt/sources.list.d/rcrowley.list

sudo wget -O /etc/apt/trusted.gpg.d/rcrowley.gpg http://packages.rcrowley.org/keyring.gpg

sudo apt-get update

sudo apt-get -y install freight

Configuring Freight

Copy the example conf

sudo cp /etc/freight.conf.example /etc/freight.conf

Edit /etc/freight.conf and set the GPG option to the email address you entered when generating your key.

Add deb packages

Take a deb file you have already created and add it to the repository. The apt/squeeze, apt/lucid, etc. arguments are the distributions the deb should be published to.

freight add foobar_1.2.3-1_all.deb apt/squeeze apt/lucid apt/natty

Build the cache

freight cache

Setting up Nginx

Now we need to serve out the repository over http. You can use any web server to do this. I chose Nginx and here is the setup procedure.

sudo apt-get install nginx

Set up the hosting file

cd /etc/nginx/sites-available

sudo vi mydomain.com

Put something like this into the site configuration file

server {
    listen 80;
    server_name mydomain.com;
    access_log /var/log/nginx/mydomain.com.access.log;
    error_log /var/log/nginx/mydomain.com.error.log;

    location / {
        alias /var/cache/freight/;
    }
}

Enable the new site

cd /etc/nginx/sites-enabled

sudo ln -s /etc/nginx/sites-available/mydomain.com mydomain.com

Restart Nginx

sudo service nginx restart

Consume packages

Now to install the packages from your new repository on a Debian machine, add the source list and the key

echo "deb http://mydomain.com $(lsb_release -sc) main" | sudo tee /etc/apt/sources.list.d/mydomain.list

sudo wget -O /etc/apt/trusted.gpg.d/mydomain.gpg http://mydomain.com/keyring.gpg

Now you can do the normal apt-get procedure to install a package

sudo apt-get update
sudo apt-get -y install foobar

AngularJS interval example

One useful technique for keeping content synced between client and server is continuous polling. Websockets are even better, of course, but this post covers the polling technique using AngularJS. AngularJS has an $interval service that does this well. However, the docs warn that you must manually cancel the interval when you no longer want it to persist, such as when the controller is destroyed. This can be done with the following code snippet.

var startedInterval;
var num = Math.random();

$scope.startPing = function(num) {
    // Don't start a second interval if one is already running.
    if (angular.isDefined(startedInterval)) {
        return;
    }
    startedInterval = $interval(function() {
        console.log("Ping " + num);
    }, 1000);
};

$scope.stopPing = function() {
    if (angular.isDefined(startedInterval)) {
        $interval.cancel(startedInterval);
        startedInterval = undefined;
    }
};

// Kill the interval when the controller goes away.
$scope.$on('$destroy', function() {
    $scope.stopPing();
});

This is just a quick demonstration of this technique.

Getting Ionic framework running on Ubuntu

It is very frustrating when Ionic setup appears to be as simple as http://ionicframework.com/getting-started/ makes it look, yet you hit all kinds of errors when following those instructions. Primarily, there were Cordova and Android dependencies that were missing and are not installed if you only follow the Ionic instructions.

So here they are. First install cordova and ionic:

sudo npm install -g cordova

sudo npm install -g ionic

Install ADT

Download the ADT bundle from the Android developer site and unzip it somewhere that makes sense in your home directory.

Add paths

Edit your .bashrc and add this to the bottom:

export PATH=$PATH:/[path to adt]/adt-bundle/sdk/platform-tools:/[path to adt]/adt-bundle/sdk/tools

Save your .bashrc then run

source ~/.bashrc

Then install ant

sudo apt-get install ant

Then create an AVD

android create avd -n <name> -t <targetID>

You can list targets to see what you have available:

android list targets

Now that everything else is installed and an AVD is created, you can continue with the Ionic instructions.

$ cd myApp
$ ionic platform android
$ ionic emulate android
$ ionic run android <-- still getting an error on this one