[PoC] Partially random passwords: how to protect users’ passwords from keyloggers

One of the issues we all face when we log in to online accounts, especially on public computers or any computer we don’t own, is the risk of getting our passwords stolen, notably by keyloggers. If a hacker gets a “copy” of your password, she can log in to your account and do whatever she wants.

As always, using two-factor authentication (2FA) can mitigate this issue, since the hacker would need access to your phone as well. But what if 2FA is not an option, and we want to protect the user even in this case? Can we detect, when the hacker tries to log in, that it is not the real user but rather somebody who stole her password?

Partially-random passwords

One solution for this problem is to use partially-random passwords with a random part that is different each time.
What I mean by that is the following:
Let’s assume that my password is 123456789. When I try to log in I will not use this password directly; instead, I’ll add some random string to the beginning or to the end (or even in the middle), like this:
564564123456789

and when we receive this password (let’s call it the “raw password”), we strip away the random part and use the remaining part as the password, and we save a hashed version of the raw password in a used_passwords table. The next time the same user tries to log in, we check whether this raw password (the real password + the random part) was used before, and we deny access to the account if we find it.
The code will look like this:

public function postLogin(Request $request)
{
    $data = $request->all();

    $email = $data['email'];
    $rawPassword = $data['password'];
    $user = User::where('email', $email)->first();

    // Unknown email: fail the same way as a wrong password.
    if (! $user) {
        return redirect()->route('login')->with('authentication-issue', true);
    }

    // Strip the random part away to recover the real password.
    $password = $this->getRealPassword($rawPassword, $user);

    if (Auth::attempt(['email' => $email, 'password' => $password])) {
        // Deny access if this exact raw password has been used before.
        if (! $this->usedBefore($user->id, $rawPassword)) {
            $this->saveRawPassword($user->id, $rawPassword);
            return redirect('/home');
        } else {
            Auth::logout();
            return redirect()->route('login')->with('authentication-issue', true)->with('used-password', true);
        }
    } else {
        return redirect()->route('login')->with('authentication-issue', true);
    }
}
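
The three helper methods are not shown here (the full implementation is in the repo linked at the end); a minimal sketch, assuming for now that the random part is a fixed-length prefix of six characters and that a hypothetical UsedPassword model sits on top of the used_passwords table, could look like this:

// Assumption: the random part is always the first 6 characters.
private function getRealPassword($rawPassword, $user)
{
    return substr($rawPassword, 6);
}

// Has this exact raw password (real password + random part) been seen before?
// Raw passwords are stored salted-hashed, so each stored hash must be checked.
// requires: use Illuminate\Support\Facades\Hash;
private function usedBefore($userId, $rawPassword)
{
    return UsedPassword::where('user_id', $userId)
        ->get()
        ->contains(function ($used) use ($rawPassword) {
            return Hash::check($rawPassword, $used->password_hash);
        });
}

// Remember the raw password so it can never be replayed.
private function saveRawPassword($userId, $rawPassword)
{
    UsedPassword::create([
        'user_id'       => $userId,
        'password_hash' => Hash::make($rawPassword),
    ]);
}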

What if the hacker knows exactly which part of the password is random?

This is a valid concern: if our hacker knows that a particular website implementing the partially-random password technique strips away the first 6 characters each time, she will just do the same and log in to the victim’s account.

The solution to this problem would be to let the user herself choose the length of the random part and its position in the password, so that each time we attempt to authenticate a given user we apply her own “rules” (i.e. the position and the length of the random part change from user to user), as sketched below.
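
With per-user rules, the getRealPassword() sketch above would consult these rules instead of a fixed prefix; assuming hypothetical random_part_position and random_part_length columns on the users table:

private function getRealPassword($rawPassword, $user)
{
    // Cut the user-specific random span out of the submitted raw password.
    return substr_replace(
        $rawPassword,
        '',
        $user->random_part_position,
        $user->random_part_length
    );
}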

We can even add multiple random parts to the same password, which makes extracting the real password from any captured password even harder.

And we can even combine this concept of partially-random passwords with the concept of password-based roles and actions I described in my previous article, adding another layer of security to accounts by triggering specific actions when a password is used for the second time.


You can find the repo (demo) of this concept here:

https://github.com/djug/partially-random-passwords

How to avoid duplicates when you insert hundreds of thousands of entries into the same MySQL table

Let’s assume that you are working on an application that requires all entries in a table (let’s call it entries) to be unique.
If we are writing our application with PHP/Laravel, the migration (more precisely its up() method) for the table would look like this:

public function up()
{
    Schema::create('entries', function (Blueprint $table) {
        $table->increments('id');
        $table->string('parameter_001')->nullable();
        $table->string('parameter_002')->nullable();
        $table->string('parameter_003')->nullable();
        $table->string('parameter_004')->nullable();
        $table->timestamps();
    });
}

One way to solve this issue (if not the most obvious one that comes to mind) is the following:

use a simple check: create a method called isDuplicated($entry) that searches for the entry in the table; if it doesn’t exist, insert it, otherwise throw an exception.

This method could look something like this (it should be added to the Entry model class):

public static function isDuplicated($inputs)
{
    // Count the rows that match all four parameters.
    $exists = self::where('parameter_001', $inputs['parameter_001'] ?? null)
                  ->where('parameter_002', $inputs['parameter_002'] ?? null)
                  ->where('parameter_003', $inputs['parameter_003'] ?? null)
                  ->where('parameter_004', $inputs['parameter_004'] ?? null)
                  ->count();

    return $exists > 0;
}
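
With that helper in place, the insertion path is a simple check-then-insert, something like this sketch:

if (Entry::isDuplicated($inputs)) {
    throw new \Exception('Duplicate Entry');
}

Entry::create($inputs);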

PS: we could add a listener on the insertion and run this check automatically whenever Entry::create() is used, but for the sake of simplicity we won’t do that in this tutorial.

This solution will work just fine, and it might be good enough for the majority of cases, but let’s assume that your application is big and inserts millions of entries every day or hour. After a while you’ll definitely notice some duplicate entries.

“How would this happen?”, you might ask, “you are already checking the table before inserting!” It turns out the problem arises when the application tries to insert the same entry twice at the exact same moment: both operations check whether the entry already exists at the same time, so isDuplicated($entry) returns false for both, and we end up inserting the same entry twice.

The first solution that comes to mind now is the following: use some locking mechanism (at the MySQL level):

either by using transactions:
https://laravel.com/docs/5.5/database#database-transactions
or pessimistic locking:
https://laravel.com/docs/5.5/queries#pessimistic-locking
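
For illustration, such an attempt might look something like this sketch:

// requires: use Illuminate\Support\Facades\DB;
DB::transaction(function () use ($inputs) {
    // Try to lock any matching row before deciding whether to insert.
    $exists = Entry::where('parameter_001', $inputs['parameter_001'] ?? null)
                   ->where('parameter_002', $inputs['parameter_002'] ?? null)
                   ->lockForUpdate()
                   ->count();

    // For a brand new entry there is no row to lock, so two concurrent
    // transactions can both count zero rows and both insert.
    if (! $exists) {
        Entry::create($inputs);
    }
});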

In this case, we would be making sure that we can’t insert the same row twice, right?

This seems like a very good solution, but if you think about it, when you try to insert a new entry there is nothing to lock in the first place: we are not updating an existing row (a situation where locking would work just fine), we are inserting a brand new one.

So the lock mechanism would be worthless in this case.

Another solution that you might think of is to add a unique index to the entire table, a key composed of all the fields.
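
In a Laravel migration, that composite key could be declared inside the same Schema::create() call from the migration above:

$table->unique(['parameter_001', 'parameter_002', 'parameter_003', 'parameter_004']);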

There is a problem that arises here (remember, we are talking about hundreds of thousands of entries): we might experience performance issues really quickly, since maintaining an index over all the columns slows down write operations on the table significantly.

Even if we don’t consider performance issues, we still have another problem.

Let’s say that we are trying to insert this entry twice:

['parameter_001'=> 'value_001',
'parameter_002'=> 'value_002',
'parameter_003'=> 'value_003',
'parameter_004'=> 'value_004'
];

As expected, the first one will get inserted and the second one will get rejected, due to violation of the index/key we are using.

But if we try to insert the following entry twice:

['parameter_001'=> 'value_001',
'parameter_002'=> 'value_002',
'parameter_003'=> 'value_003'
];

it will get inserted twice without any problem. But why would this happen?

Remember that our fields in this table are nullable, and MySQL unique indexes treat NULL values as distinct. This means our “global” index only guards against a duplicate when all sub-keys are present in the inserted entry; as soon as a sub-key is omitted (and therefore stored as NULL), the check passes and we end up with a duplicate entry.

Even though this solution is not the right one, it gets us close to what we should do, in the sense that we should let MySQL handle the issue instead of doing it in the code. We can use the same concept without compromising performance and without failing when a “sub-key” is omitted.

The solution to this problem is to create an additional column that represents a hash of the entry (let’s call it hash), and this hash will play the role of a unique key. When we try to insert the entry, we add its hash to it and attempt the insert: if the operation goes through, i.e. MySQL doesn’t throw any exception, then the entry is indeed unique; otherwise MySQL will throw an exception, since it can’t insert two rows with the same unique key.

First we need to add this line to the up() method of the Entry table migration:

$table->string('hash')->unique();

Now, there is one more thing we should take care of in this situation: we need to make sure that the hash is unique and that there are no collisions. In other words, we can’t use a one-way hash like md5, since we could eventually end up with two different entries that have the same key, but something reversible like base64 (or even just a concatenation of all the fields) would work.

PS: don’t forget to add the fillable property to the model class:

protected $fillable = ['parameter_001', 'parameter_002', 'parameter_003', 'parameter_004', 'hash'];

Another thing that we should definitely consider when we create the hash is to add a separator between the fields before we generate it. In other words, when we try to insert the following entry:

['parameter_001'=> 'value_001',
'parameter_002'=> 'value_002',
'parameter_003'=> 'value_003'
];

we should generate the hash of something like value_001-value_002-value_003- (with an empty slot for the missing parameter_004).

PS: do not use a simple implode() on the raw input array here, since that would silently skip the fields that are not present in the entry; fill in every field first (with an empty string for the missing ones), then join them.

The reason behind this is that without a separator we would get false positives whenever some parameters are missing and the same values are used under different fields.
For instance, the two following entries are not duplicates (i.e. both should be inserted without any problem), but MySQL would reject whichever one is inserted second:

['parameter_001'=> 'value_001',
'parameter_002'=> 'value_002',
'parameter_003'=> 'value_003'
];

and

['parameter_001'=> 'value_001',
'parameter_002'=> 'value_002',
'parameter_004'=> 'value_003',
];

because if we concatenate without any separator, we end up with the hash of value_001value_002value_003 in both cases, whereas with a separator we generate the hash of value_001-value_002-value_003- for the first entry and value_001-value_002--value_003 (notice the double - sign) for the second one, which is a different hash.

To sum up: each time we want to insert a new entry, we generate a unique hash for it and insert the hash along with the entry; since the hash column has a unique index on it, a duplicate entry won’t get inserted and MySQL will throw an exception instead.
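
Putting these pieces together, the hash generation could live in a small helper on the Entry model. Here is a minimal sketch (makeHash() is a hypothetical name):

public static function makeHash(array $inputs)
{
    $fields = ['parameter_001', 'parameter_002', 'parameter_003', 'parameter_004'];

    // Fill every position, so a missing field leaves an empty slot between two
    // separators instead of silently disappearing (as a bare implode() would do).
    $parts = array_map(function ($field) use ($inputs) {
        return $inputs[$field] ?? '';
    }, $fields);

    // base64 of the separated concatenation is reversible, hence collision-free,
    // unlike a one-way hash such as md5.
    return base64_encode(implode('-', $parts));
}

Inserting then becomes Entry::create($inputs + ['hash' => Entry::makeHash($inputs)]);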

What about the other exceptions that MySQL might throw and that have nothing to do with deduplication?

If we want to catch the exception related to this deduplication mechanism (we might want to log it somewhere, or execute some specific logic when it happens), then catching every Exception won’t be a good solution, since MySQL might throw an exception for a completely different reason than duplication.

To solve this issue, we need to analyze the exception thrown and execute the right logic accordingly.
If you take a look at the MySQL documentation on server error messages, you can find the duplication error that we want to handle:
https://dev.mysql.com/doc/refman/5.7/en/error-messages-server.html#error_er_dup_entry

As you can see, when we try to insert a duplicate entry, MySQL raises the ER_DUP_ENTRY error, which has SQL state 23000 and error code 1062, so we can write a method to verify this:

private function isDuplicateEntryException(QueryException $e)
{
    // errorInfo holds [SQLSTATE, driver error code, driver error message].
    $sqlState = $e->errorInfo[0];
    $errorCode = $e->errorInfo[1];

    return $sqlState === "23000" && $errorCode === 1062;
}

And then we can add this check to our code as follows:

try {
    // insert the entry ($inputs should already include its hash)
    Entry::create($inputs);
} catch (QueryException $e) {
    if ($this->isDuplicateEntryException($e)) {
        throw new DuplicateEntryException('Duplicate Entry');
    }

    // Not a duplication error: re-throw the original exception.
    throw $e;
}
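
DuplicateEntryException itself is just an application-level exception; a one-line definition is enough:

class DuplicateEntryException extends \Exception
{
}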