Monday, December 23, 2013

sqlite insert or replace to avoid primary key violation

Notice how the second insert fails because one of the primary keys is not unique.  However, if you use insert or replace, the insert command acts somewhat like an update.

sqlite> create table a (a int constraint pk primary key, b int);
sqlite> insert into a values (1,1);
sqlite> insert into a values (1,5),(2,2);
Error: column a is not unique
sqlite> select * from a;
sqlite> insert into a values (3,5),(2,2);
sqlite> select * from a;
sqlite> insert or replace into a values (1,9),(2,9),(3,9);
sqlite> select * from a;
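The same session can be sketched with Python's built-in sqlite3 module (a translation of the CLI transcript above, not code from the post):

```python
import sqlite3

# In-memory database mirroring the CLI session above.
con = sqlite3.connect(":memory:")
con.execute("create table a (a int constraint pk primary key, b int)")
con.execute("insert into a values (1, 1)")

# A plain insert with a duplicate key fails; the whole statement is rolled back.
try:
    con.execute("insert into a values (1, 5), (2, 2)")
except sqlite3.IntegrityError as e:
    print("insert failed:", e)

# INSERT OR REPLACE overwrites the conflicting row instead.
con.execute("insert or replace into a values (1, 9), (2, 9), (3, 9)")
print(con.execute("select * from a order by a").fetchall())
# [(1, 9), (2, 9), (3, 9)]
```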

Tuesday, December 3, 2013

web site performance

I recently added "app.use(express.compress());" to my node project to gzip responses.  I had some pretty big JSON responses, which are now nice and small.  So I wanted to find some other things to speed up my site.  I found this article:

I can't believe how comprehensive that is!  All the best practices in one spot.  Awesome!

Tuesday, November 26, 2013

Linux find certain files containing text and replace that text

Here we go!  This was a fun command to type out.

  • Find files in the current and sub-directories matching the name *Spec*
  • Files must contain the "search_text"
  • Replace search_text with replace_text

find . -name "*Spec*" -exec grep -l "search_text" {} \; | xargs perl -pi -e "s/search_text/replace_text/"

Danger! Be careful with this command.  Don't blame me if you mess up all your files!
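For illustration, here's a rough Python equivalent of that pipeline (the function name is made up; the shell one-liner above is the real tool, and the same warning applies):

```python
import pathlib

def replace_in_spec_files(root, search_text, replace_text):
    """Walk root for *Spec* files containing search_text and rewrite them in place."""
    for path in pathlib.Path(root).rglob("*Spec*"):
        if not path.is_file():
            continue
        text = path.read_text()
        if search_text in text:           # like grep -l: only touch matching files
            path.write_text(text.replace(search_text, replace_text))
```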

Wednesday, November 6, 2013

Download data from Angular ngGrid as CSV

I have an Angular ngGrid on my page.  I want the user to be able to download a spreadsheet copy of the data.  Ideally I want the filtered copy of the data.

First - ngGrid CSV Export Plugin

First I tried the ngGrid plugin, CSV Export.  It worked fine until I threw 10,000 rows at it; then it added on the order of 10 seconds to my page load.  Also, once I got over about 6,000 rows, clicking the CSV Export button would crash my page.  (Using Chrome 26.)

This technique relies on data urls.  Read more about those here:

This is no offense to the creator(s) of the CSV Export Plugin, just my experience as of this writing.

It's possible that a data URI will still work as per this example:
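As a sketch of the data-URI idea (not the plugin's actual code), a CSV payload can be embedded directly in a link target, here in Python:

```python
import base64

# A data URI carries the file contents inside the link itself,
# so clicking it downloads the data without a round trip to the server.
csv_text = '"a","b"\n1,2\n3,4'
uri = "data:text/csv;base64," + base64.b64encode(csv_text.encode()).decode()
print(uri)
```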

Second - download hyperlink

The second option I tried was to create a hyperlink that would download the data by calling a service (provided by node.js).  This is a fairly simple option, but it has drawbacks: it requires a duplicate of the service call that loaded my grid in the first place, it requires that I edit the service to support returning CSV, and it will not give me the filtered rows of the grid.  Anywho, I added this to my html:

<a ng-href='/history/{{qs}}'>download spreadsheet</a>

And in my route (node.js):

exports.history = function(req, res) {
  res.set({
    'Content-Disposition' : 'attachment; filename="a.csv"',
    'Content-Type' : 'text/csv'
  });
  res.send("\"a\",\"b\"\n1,2\n3,4"); // Hard coded example.
};

This works, but I'm still not really satisfied.

Third - mini form submit

This is a cool idea I found based on a stack overflow post.  Basically, you form-submit JSON from your $scope to a service.  This is similar to the second option, but it uses a form in the html instead of a hyperlink.  Also, it is a POST, so you are actually posting the JSON data to the back end; there you convert the data from JSON to CSV (using json2csv), then return the data as an "attachment".  You can't use the node/express "attachment" function since that takes a file, but you can mimic it by setting the headers yourself (also shown in the second option above).

This option works really well.

Here's the HTML:

<form action='/json2csv' method='post'>
  <input type="hidden" name="json" value="{{mydata}}" />
  <input type='submit' value="3. Download Spreadsheet"/>
</form>

And here's the server side node code:

exports.json2csv = function(req, res) {
  var jsonObj = JSON.parse(req.body.json);
  var fields = Object.keys(jsonObj[0]);
  json2csv({
    "data" : jsonObj,
    "fields" : fields
  }, function(err, csv) {
    if (err) console.log(err);
    res.set({
      'Content-Disposition' : 'attachment; filename="a.csv"',
      'Content-Type' : 'text/csv'
    });
    res.send(csv);
  });
};
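The JSON-to-CSV conversion that json2csv performs in the route above can be sketched with Python's standard library (an illustration of the idea, not the json2csv internals):

```python
import csv
import io
import json

def json_to_csv(json_text):
    """Convert a JSON array of flat objects to CSV text."""
    rows = json.loads(json_text)
    fields = list(rows[0].keys())          # same trick as Object.keys(jsonObj[0])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(json_to_csv('[{"a": 1, "b": 2}, {"a": 3, "b": 4}]'))
```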

Thursday, October 17, 2013

Angular ngGrid watch for filtered row changes

I have a grid and a chart.  When the user changes the grid either via a filter or a sort, I want the chart to automatically update.  Aside from writing a "converter" to convert the model data from the grid format to the chart format, I needed to get my "watch" right.  Get it wrong and you will get this error:

Error: 10 $digest() iterations reached. Aborting!

I don't like that error.  That means something is happening recursively, like your watch is triggering a function that changes the model which you are watching and the watch fires again!  Recursive nightmare!

Skip ahead to the EDIT section.

OK, aside from that, what do you have to watch on an ngGrid to see if it "changed"?  Well, the $watch method allows you to pass a string name of the model object to watch, or you can pass in a function!  Watching a huge object is time consuming!  It's better to watch a number or a string!  (Angular will be comparing the "old" and "new" values quite a lot.  If the comparison is easy, your app will be faster.)

So I implemented this code to watch the ngGrid filteredRows object and a few others to see if the filtered data in the grid has changed.  This works great.  The one thing I don't like about it is that I am dependent on these variables in the ngGrid.  What if in the future the variable names changed?  I would much prefer a built in method to see if something has changed, but maybe in the future.  Here's the code.  Leave a comment if you think there is a better way!

$scope.gridChangedIndicator = function(scope) {
  var watchString = "";
  try {
    watchString += scope.gridOptions.ngGrid.filteredRows.length;
    watchString += scope.gridOptions.ngGrid.lastSortedColumns[0].field;
    watchString += scope.gridOptions.ngGrid.lastSortedColumns[0].sortDirection;
  } catch (err) {}
  return watchString;
};

// When the grid data changes, the chart needs to change too.
$scope.$watch($scope.gridChangedIndicator, $scope.refreshChart);

EDIT 1: This code was killing me.  This watch was sending Angular into an infinite-watch-nightmare-evil-glare.  I found out that using $watch in a controller can be dangerous.  So I changed the above code to this:

$scope.$on('ngGridEventRows', $scope.refreshChart);

This event gets fired a little more often than I want, but my chart refresh is really fast.  I'd rather have it fire a few extra times than a million extra times.  This documentation about ngGrid Events is really helpful.  As of this moment, I think the events are a little lacking, and some do not seem to fire at the correct time for me.  (For example, the ngGridEventFilter event fires before the filter is really done; ngGridEventSorted fires after, which seems more useful.)


EDIT 2: OK, I'm still getting CPU-burning infinite loops.  It all seems to be within the Angular digest looping.  I wish I knew why, but I've burned up enough hours trying to do this.  I simply added a button to update the chart.  That only took 2 minutes because Angular is awesome!  PLEASE, if anyone else is doing this effectively out there on the great internet, leave a comment to tell me how you did it.

EDIT 3: This is all in flux right now.  The issue is still happening even after decoupling the chart from the grid.  In a way that is good news, but now I need to figure out what is killing the page.

EDIT 4: Enough edits already?  I was being hit by a double whammy, and it exhibits itself in the same code ($digest).  Indeed, when I was adding a $watch more than once in my controller, it was causing really bad performance in $digest.  But also, ngGrid has an issue clearing the search filter; it goes nuts in the $digest.  It is issue 777 on github.

Tomorrow for EDIT 5, I'll put the grid and chart back together and simply disable filtering on my ngGrid to see if that will solve it.  Good night!


EDIT 5: ngGrid has a bad issue when the search filter is cleared out.  See issue 777 on github.  That was causing issues in $digest.  But I also had a call to $scope.$watch that was happening more than once in my controller, which is also bad and causes a very similar issue.  It turns out the $on code above is called quite often... more than necessary.  So I reverted back to the $watch code instead.

Wednesday, October 16, 2013

sqlite3 performance

I've been tweaking sqlite3 performance today.

Some things I've learned:

  1. Sequelize is awesome, but it does not use bind variables.  Adding bind parameters to my SQL increased the speed a lot.  I'll use Sequelize for most of my stuff, but when I query a 1 million row table, I will just use node-sqlite3 because I can use bind parameters.
  2. Indexes are huge for sqlite.  However, sqlite will only use one index per table in a SELECT query.  I have a view with maybe 4 tables, so luckily I can use one index per table.  Sqlite does not always choose the best index, though.
  3. Using the sqlite3 command line with .timer on, .explain on, and explain query plan before your SQL statement are huge!  You want to be doing a search on an index, NOT a scan.
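Points 2 and 3 can be demonstrated with Python's sqlite3 module; the table and column names below are made up to echo the query later in this post:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table history (finishAt text, projectId int, CustomerId int)")
con.executemany("insert into history values (?, ?, ?)",
                [("2013-09-%02d" % d, 1, 1) for d in range(1, 31)])

sql = "select * from history where projectId = ? and finishAt > ?"

# Without an index, the plan is a full table SCAN.
print(con.execute("explain query plan " + sql, (1, "2013-09-16")).fetchall())

con.execute("create index idx_hist on history (projectId, finishAt)")

# With the index, the plan becomes a SEARCH ... USING INDEX idx_hist.
print(con.execute("explain query plan " + sql, (1, "2013-09-16")).fetchall())
```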

BTW, here's an example of the bind parameters.

var sql = "SELECT blah ";
sql += "FROM `mytable` ";
sql += "WHERE finishAt > ? and projectId=? and CustomerId = ? ORDER BY finishAt desc LIMIT 10000";

db = new sqlite3.Database('../mydb.sqlite', function() {
  console.time("timestamp ");
  db.all(sql, ['2013-09-16', 1, 1], function(err, rows) {
    if (err) console.log(err);
    console.log("Ran sql " + sql);
    console.timeEnd("timestamp ");
    // Force a single returned object into an array.
    rows = [].concat(rows);
    console.log("Got " + rows.length + " records.");
    if (rows.length > 0) {
      console.log("Got the rows[0] " + JSON.stringify(rows[0]));
      console.log("Got the rows ...");
    }
  });
});


Saturday, October 12, 2013

sqlite in eclipse

I was able to download a JDBC driver for sqlite.

I put the jar file in my /opt/eclipse/plugins dir.

Then I followed the instructions to set up the driver and the database connection.

I found that I could not create the connection by right clicking "Database Connections".  It did not allow me to put in a connection NAME!  So I had to use the icon to create the connection.

Anyway, the default database name is "main" in sqlite.  In the Data Source Explorer, you literally see nothing when you drill into ANYTHING, but that's ok!  You can still run SQL which is the most important part!  And if you need to know the table names run:

SELECT name, type FROM sqlite_master WHERE type in ('table', 'view')

pragma table_info(customers);
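The same introspection works from code too; here's a quick sketch with Python's sqlite3 module (the customers table is just an example):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table customers (id integer primary key, name text)")

# List tables and views, same query as above.
print(con.execute(
    "select name, type from sqlite_master where type in ('table', 'view')"
).fetchall())
# [('customers', 'table')]

# Column details, same as: pragma table_info(customers);
for cid, name, coltype, notnull, default, pk in con.execute(
        "pragma table_info(customers)"):
    print(name, coltype, "pk" if pk else "")
```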

Happy days!

Friday, October 11, 2013

Sequelize sqlite query taking a long time.

I had a pretty simple select statement.  It took a few milliseconds when submitted in the sqlite3 command line utility.  Using Sequelize, it was taking 2.5 seconds.  I found that you can choose to use "raw" data, meaning you get Objects back that you can transform into JSON instead of getting "Models" back.  It can take some time to convert all of the records returned from the database into Models.

Here's the documentation on raw (search for "Raw queries"):

Since I started using raw for certain things, my performance has improved a lot!  Sweet!

Thursday, October 3, 2013

Node.js POST service to a database

I just created a web service that I can use to POST data and insert the data into a database.

Here's the code from my server.js, which is an express app:

var history = require('./routes/history');
app.post('/history', history.create);

I have a route called history.js which services this route:

exports.create = function(req, res) {
  console.log("Create the history");
  // Sequelize does the db insert here.
  res.send(200);
};
This is a very crude example with no real error handling, but you get the point.  The variable "req.body" has the "post data".  Now I can submit data via curl like this:

$ ./
+ curl -X POST -H 'Content-Type: application/json' --data @hist.json

I have a separate file in the same directory called hist.json which has a json object of the same model as my Sequelize table.

In the node application log you will see:

Create the history
POST /history 200 17ms - 2
Executing: INSERT INTO `BaseHistories` (`blah1`,`blah1`) VALUES ('a','b');

This blog post assumes you have node, express and sequelize set up.  Happy databasing!

Wednesday, October 2, 2013

ShellEd vs ColorEditor Eclipse Plugins

I really like the ShellEd eclipse plugin, but it has killed my productivity.  I just opened a bash file and went to get coffee.  When I came back, it was still not open in the editor (hourglass).  Bummer.  So I uninstalled the plugin and reinstalled.  Same issue.

I just installed a different editor for eclipse, ColorEditor.  The install was really easy and now I have syntax highlighting for 100+ file formats.  Sweet.  I don't do a lot of shell script editing, so just having syntax highlighting is fine with me.

Thursday, September 19, 2013

ng-grid review

I really, really wanted to like ng-grid.  When you are using angular, it definitely gives you a lot of functionality that you would otherwise have to build into your own ng-repeat directive.  Actually, it is pretty easy to set up too.

Unfortunately, I think jqgrid has it beat.  jqgrid has some pretty awesome sorting and filtering functionality.  ng-grid is lacking the column 'auto' width.  It does not sound like a big deal, but if you don't know what your data widths will be, it is a pain.  I don't want to specify each width of each column.

EDIT:  I changed my mind!

I came up with a pretty cool solution which computes your column widths for ng-grid!  Includes jasmine unit test FTW!

Thursday, September 5, 2013

Perl cfg file processing

This is some pretty awesome code I wrote a while ago.  I had to share.  I removed the code that reads the config file into an array, @cfgList, which is trivial.

    #-- Load cfg file into a hash --#
    # How the regex's work:
    # Ignore/blank out any comment lines.
    # Remove any beginning space.
    # Remove any trailing space.
    # Capture the first word as the hash key, look for space = space, then
    # capture the words in between the single quotes as the hash value.
    my %cfgHash = map {
        s/^\s*#.*$//;
        s/^\s+//;
        s/\s+$//;
        /^(\w+)\s*=\s*'([^']*)'$/ ? ($1 => $2) : ();
    } @cfgList;

This will load a config file into a hash.  Here's a sample cfg file:

# -----------------------------------------------------------

# Read input from here:
inputDir = '/some/dir'

# Processing dir
localDir = '/other/dir'
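A rough Python equivalent of the Perl map, reconstructed from the comments above (so treat the regex as an approximation of the original):

```python
import re

def load_cfg(lines):
    """Parse key = 'value' lines into a dict, skipping comments and blanks."""
    cfg = {}
    for line in lines:
        line = line.split("#", 1)[0].strip()     # ignore comment lines, trim space
        m = re.match(r"^(\w+)\s*=\s*'([^']*)'$", line)
        if m:
            cfg[m.group(1)] = m.group(2)         # key => quoted value
    return cfg

sample = [
    "# Read input from here:",
    "inputDir = '/some/dir'",
    "",
    "localDir = '/other/dir'",
]
print(load_cfg(sample))
# {'inputDir': '/some/dir', 'localDir': '/other/dir'}
```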

Saturday, August 31, 2013

Github push to upstream 403 error with https

So, it's not always easy to set up a project with EGit for Eclipse, but once you have it set up, you are happily rockin' away.

I have not set up the ssh keys yet, so I want to use https.  For some reason when you add the remote with EGit, it does not properly add your github user name to the configuration.  So I have used this as a workaround.

Here is the error:

$ git push -u origin master
error: The requested URL returned error: 403 Forbidden while accessing

fatal: HTTP request failed

Here is the fix.  Note that I have added my github user name and the at symbol.

$ git remote add origin

$ git push -u origin master
Counting objects: 16, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (8/8), done.
Writing objects: 100% (16/16), 1.47 KiB, done.
Total 16 (delta 1), reused 0 (delta 0)
 * [new branch]      master -> master
Branch master set up to track remote branch master from origin.

Negatives of this approach: You have to enter your password each time you push.

Tuesday, June 11, 2013

Linux find a file with wildcard directory and file name

I needed to find a deeply nested directory whose exact name I did not know.  I also needed to find a file in that directory, again with a wildcard.

Here's my solution:

find /data -name "*crazydir" -type d -exec find {} -name "*crazyfile*" \; 2>/dev/null
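A Python sketch of the same nested search (os.walk standing in for the two find passes; names are the made-up ones from the command above):

```python
import fnmatch
import os

def find_nested(root, dir_pat, file_pat):
    """Find files matching file_pat inside any directory whose path
    contains a component matching dir_pat."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        if any(fnmatch.fnmatch(p, dir_pat) for p in dirpath.split(os.sep)):
            hits.extend(os.path.join(dirpath, f)
                        for f in filenames if fnmatch.fnmatch(f, file_pat))
    return hits
```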


Tuesday, April 9, 2013

Tuesday, March 19, 2013


I'm working on parsing some XML files today.  I'm learning more about XPath, which is really fun.  It kind of reminds me of regex for string parsing.  You *could* write your own loops to process XML or strings, but XPath and regex are much more ... fun(?).  Plus, less is more, right?  If you can get what you need with one easy-to-understand line of code as opposed to several lines... great!  XPath is nice and readable, but not as fast as SAX.
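For example, Python's xml.etree.ElementTree supports a useful subset of XPath (the document below is made up):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<library>
  <book year="2013"><title>XPath Basics</title></book>
  <book year="1999"><title>Old News</title></book>
</library>
""")

# One readable expression instead of hand-written loops.
titles = [t.text for t in doc.findall(".//book[@year='2013']/title")]
print(titles)  # ['XPath Basics']
```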

Here's a quick reference to XPath:

Sunday, February 24, 2013

Eclipse Open in New Window

Two posts in one day?  Crazy Jess!

So, I sometimes discover things by accident, and this is one of those times: I clicked something weird and discovered coolness.  In eclipse you can right click the Project Explorer and select Open in New Window.  An entire new Eclipse workbench opens.  OK, so not a huge deal, right?  Well, you can have the "Data" perspective open in one window with all your hottest, awesomest SQL's ready to fire away, and a second window open with all your nasty java code.  Life is good.  I was tired of switching between perspectives and trying to find my SQL file.

Javascript alert is bad


Just a quick post.  I just decided to NEVER use the javascript alert function for debugging again.

  1. It's a pretty crude method of debugging.
  2. If you get an alert in a loop, get ready to close 100 pop ups.  (Some browsers are cool about this)
  3. Oops, I forgot to take it out when I deployed to the test system.  (Like I just did)
  4. There are probably other reasons, like what happens when you test your code outside of a browser?
Anyway, I already have a wrapper around console.log() that will be graceful if console.log is not available, so I need to remember to use that instead and forget about alert()!

Monday, February 4, 2013

Java private vs protected as a default for members

I was thinking today about an interview I recently sat in on.  The candidate was asked whether he defaults his java member variables to protected or private.  He said private.  I have some other friends who advocate that too.

I have always been more in favor of protected.  I like to use inheritance and having protected members allows me to access those members directly.  This was in the context of a video game where I had a base sprite class with simple things such as x,y coordinates and an image.  Then I had several different types of subclasses like animated sprite and maybe even something specific as a monster class.  Having protected members worked well.

I think the answer to this question is really that "it depends".  I had a thought recently about lazy loading and protected members.  If your base class has a protected member that is lazy loaded (null until the getter is called), then if you try to access it in a subclass, it might be null!  If the member was declared private you would be forced to call the getter and everything would work out all right.  So if you are writing a database application or anything else where you use lazy loading, private is the way to go.
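The lazy-loading trap can be sketched like this (Python standing in for Java here; _data plays the role of the protected member):

```python
class Base:
    def __init__(self):
        self._data = None            # lazy: not loaded until the getter runs

    def get_data(self):
        if self._data is None:
            self._data = ["loaded"]  # stand-in for a database fetch
        return self._data

class Sub(Base):
    def direct_access(self):
        return self._data            # may still be None!

    def via_getter(self):
        return self.get_data()       # always loaded

s = Sub()
print(s.direct_access())  # None  - the "protected member" trap
print(s.via_getter())     # ['loaded']
```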

Tuesday, January 22, 2013

vncserver with copy and paste: vncconfig

Super happy day.  My VNC woes are hopefully over.  I've been using Exceed onDemand from Windows to Linux for a while.  Now that I am running Linux on my laptop, EOD is just not working well.  I've had keyboard issues that I could not solve and copy/paste issues.  I kicked EOD to the curb.

Now what I do is:

  1. ssh into the linux box.
  2. run vncserver.  This will output a screen number for me like this:
  3. Then I go to my laptop and run vinagre.
  4. I connect to
  5. Vinagre is pretty nice.  I can go in and out of full screen pretty easily.  (You have to move your mouse to the top middle of the window to get the options window to get out of full screen)
  6. OK, so this is all working pretty well, except I could not copy and paste.  Pretty dumb.
  7. So I googled the issue at hand and didn't find much.  Then I googled the vncserver man page and found vncconfig.  Go figure.  You have to run this command to get copy/paste working:
  8. vncconfig -nowin &
  9. Run that in a command line window in Vinagre or in your original ssh.
  10. Now you can copy and paste like a champ!  It's awesome because it is a true shared clipboard.  You don't have to use a mouse middle button.  Seamless baby!

Even better, put your vncconfig -nowin & in your ~/.vnc/xstartup file.