Saturday, November 9, 2013

A Simple HyperSQL Database Server

I'm in the process of rebuilding one of my development boxes at the moment, and when I needed a SQL database engine this weekend to support a bit of quick prototyping, I decided I would take a shot at setting up a HyperSQL database server. This is something I've been meaning to look into for other reasons for a while: HyperSQL is small, fast, flexible, provides good SQL coverage (better, for example, than Apache Derby), and has the ability to tune its behavior to more closely mirror other SQL engines.

Here is a quick recipe for getting a basic server up and running. This should work equally well on both Mac OS X and Linux (I have used this basic recipe on both platforms).

My system already has a Java 7 JRE/JDK installed, which is really the only prerequisite. For the sake of simplicity, I install HyperSQL in /opt/hsqldb and I keep my database-related files in ~/hsqldb-catalogs/. Commands entered at the shell prompt are shown below starting with $, while commands entered in the HyperSQL CLI are shown below starting with >. Note that some of these commands may appear wrapped below; they should each be entered on a single line when used.
  1. Download the latest version of HyperSQL (version 2.3.1, as of this writing) from hsqldb.org.
  2. Unzip the downloaded file (assumed to be in ~/Downloads/):
    $ cd ~/Downloads
    $ unzip hsqldb-2.3.1.zip
    $ sudo mv ./hsqldb-2.3.1/hsqldb /opt
    $ rmdir hsqldb-2.3.1
  3. Create the folder needed for your database files:
    $ mkdir ~/hsqldb-catalogs
  4. Create the properties file that provides basic information for the server, including our test database. Save this file as ~/hsqldb-catalogs/server.properties:
    # server.properties
    # =====
    #
    # HSQLDB configuration file
    # Databases:
    server.database.0=file:test/test
    server.dbname.0=testdb

    #
    # =====
    # Other configuration:
    # Port
    server.port=9001

    # Show output on the console (set server.silent=true to quiet this for production):
    server.silent=false

    # Show JDBC trace messages on the console:
    server.trace=false
    server.no_system_exit=false

    # Allow remote connections to create a database:
    server.remote_open=false
    #
    # =====
  5. Create the configuration file for the CLI HyperSQL tool, describing two different connections to our test database: one as the default system administrative user "SA" and one as a normal administrative user "RON", which we will create in a few minutes. Save this file as ~/sqltool.rc:
    # =====
    # Connect to testdb on localhost as SA (no password initially):
    urlid localhost-sa
    url jdbc:hsqldb:hsql://localhost/testdb
    username SA
    password

    # Connect to testdb on localhost as RON:
    urlid localhost-ron
    url jdbc:hsqldb:hsql://localhost/testdb
    username RON
    password strong-random-password
    # =====
  6. Start the server, which will create our test database (a "catalog" in HyperSQL parlance). Use the following commands in a terminal window:
    $ cd ~/hsqldb-catalogs
    $ sudo java -cp /opt/hsqldb/lib/hsqldb.jar org.hsqldb.server.Server --props ./server.properties
  7. Open a second terminal window and start the HyperSQL CLI, connecting to our test database as user "SA":
    $ java -jar /opt/hsqldb/lib/sqltool.jar localhost-sa
  8. As a general rule, I don't like signing in using the system administrator's account, so the first thing is to create an administrative user account for my use:
    > create user RON password 'strong-random-password';
    > grant DBA to RON;
    > \q
  9. Make sure you have your "strong-random-password" recorded correctly in ~/sqltool.rc, and restart the HyperSQL CLI:
    $ java -jar /opt/hsqldb/lib/sqltool.jar localhost-ron
You're done. You have a fully-functional database server, providing a single database called "testdb" for your use.
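As a quick sanity check, you can run a throwaway query from the SqlTool prompt; the `(VALUES(0))` derived table is just a one-row stand-in so there's something to select from:

```sql
-- A minimal "is it alive?" query against the running server.
select 'connected' as status from (values(0));
```

If that comes back with a single row, the server, catalog, and your sqltool.rc entry are all working.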

If you want to play with sample data, consider the following (while connected as either "SA" or your administrative account):
> create schema sampledata authorization DBA;
> set schema sampledata;
> \i /opt/hsqldb/sample/sampledata.sql
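Once the sample script has run, you can poke at the data. Note the table name below is an assumption based on what the bundled sample script creates; adjust if your version differs:

```sql
-- Assumes sampledata.sql created a CUSTOMER table in the sampledata schema.
select count(*) from sampledata.customer;
```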


When you're ready to shut down the server, you can do it while connected to the server as a user with DBA role:
> shutdown;

Alternatively, you can shut down the server from the command-line:
$ java -jar /opt/hsqldb/lib/sqltool.jar --sql 'shutdown;' localhost-ron

Some final thoughts:
  1. Read the docs: the user guide and the utilities guide for HyperSQL are good, providing fairly thorough coverage of what HyperSQL is capable of and how to use it, including connecting via JDBC.
  2. You almost certainly want to set a password for the default "SA" user, and remember to update your ~/sqltool.rc file when you do.
  3. Protect ~/sqltool.rc -- this file contains information about catalogs, users, and passwords, so restrict access to it (e.g., chmod 600 ~/sqltool.rc).
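On that second point, setting the SA password is a one-liner from the SqlTool prompt while connected as SA (the password value here is obviously a placeholder):

```sql
-- Replace the placeholder with a real password, then update ~/sqltool.rc to match.
alter user SA set password 'another-strong-random-password';
```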

Sunday, October 27, 2013

Recap: Adobe ColdFusion Summit 2013

I'm going to try to capture my thoughts on Adobe's recent inaugural ColdFusion Summit 2013 conference. You'll find lots of my own opinions here and a bit of rambling. There may or may not be much of value to you as a reader... continue at your risk.

Overall Impression

For the first year of a new conference, I felt like the organizers did a decent job of pulling it off. Acceptable facilities, good food, a good reception, good WiFi, and a good breadth and depth of topics to choose from, delivered by a combination of both known and previously-unknown (to me, at least) speakers. The conference itself had a few rough spots (more on that below). I came primarily interested in hearing Adobe's position on ColdFusion's future and walked away convinced that at least the Adobe CF team itself believes CF does have a future. I remain concerned about certain aspects of that future and am not entirely confident in the CF team (but that's a subject for a separate blog post).

Facilities and Conference Stuff

I should preface this with the statement that I am not a "Las Vegas person" -- very little about Las Vegas itself is appealing to me. I'm also not crazy about big resort hotels, and the Mandalay Bay Resort is nothing if not that. I would be perfectly happy in a smaller conference facility somewhere else.

Signage is definitely not the resort's strong suit. We spent 45 minutes wandering around Wednesday evening looking for the early check-in location, asking people who seemed to be associated with the hotel and/or convention center, before someone finally pointed us in the right direction -- only to find out they didn't have our t-shirts. We gave up, went to get dinner, and just waited to check in Thursday morning. Check-in itself on Thursday morning was simple, smooth, and fast. If you are going to offer some sort of early check-in, make it easy to find and have the stuff ready, or don't bother.

The session and keynote rooms were sufficient but only barely so: sub-par audio, small screens, and poor room layout all made for challenging settings for both speakers and listeners. I've had to present in rooms like those, and it is challenging at best. I did like the classroom sort of setting where everyone was sitting at tables, and the decent WiFi coverage was a pleasant surprise. Only one of the rooms had power at the tables; it would have been nice for that to be more widespread. Having all of the presentation venues very close to each other was a plus.

At several points in both the opening keynote and the closing comments delivered by Adobe reps, they touched on having to cap attendance at 500 and how that number was significantly above what they had originally planned for. I left wondering whether accommodating a larger-than-initially-anticipated group of attendees forced some juggling on the facilities side and contributed to room layouts, etc., that left quite a bit to be desired.

I appreciated the single-sheet agenda handed out at check-in, although I would have liked for the single-sheet agenda to have included the names of the speaker(s) for each session. The conference Web site was acceptable, although I never did figure out why they wanted us to set up a schedule there... and the session-scheduling part of the Web site was pretty clumsy. I couldn't install the conference app on my phone, but no one I talked to using the app on either Android or iOS was impressed with the app or even had much positive to say about it.

In general, the length of the sessions, the breaks, and the overall length of each day was about right. Very few of the sessions I sat in on, however, left any time for questions at the end; the organizers and content committee may want to raise this with presenters in the future, or extend the sessions slightly to accommodate questions. (I would be curious to hear opinions from the presenters themselves on this item.)

My impression is that the conference organizers hadn't made plans either to record any of the presentations or to provide a central place from which presenter slide decks could be downloaded; the second of those was a particularly surprising decision or oversight.

Some of the stumbles were surprising given that Adobe has been involved in conferences larger than this for many years. There will apparently be subsequent versions of this conference -- it seems pretty obvious from the closing wrap-up that the intent is that there will be -- and if the organizers can retain the generally high quality of most of the content and address some of the facility-related shortcomings, this could be a very good conference in the future.

Session Recap

A few thoughts on the sessions I attended...

Day 1 keynote (Ben Forta, others): Disappointing. Ben Forta started with a quick look back at the history of CF and at some of the successes and failures along the way, talked a little about where he sees things headed (noting that CF has now been "dead" for more than half its life and continues to grow and evolve in spite of being dead), and finished with a few of his own opinions on what CF's strengths are and where it should be headed. Ben then gave way to a couple of members of the Adobe CF team, and the keynote -- for a keynote, but particularly for a day 1 keynote of a new conference -- went off a cliff. It turned into a detailed demo, including code, that felt like it had largely been thrown together and had not been rehearsed or screened to fit into the allotted time. This was not opening-day keynote material, either in content or in presentation.

More concerning to me, though, was the contradiction between Ben's take on where things should be heading... immediately followed by lots of code and demos of the CF team's CFCLIENT efforts. I'm not a fan of the CFCLIENT idea, and nothing I heard in sessions or follow-up blog posts has convinced me that CFCLIENT and the effort/resources Adobe is pouring into it are a good choice. To have that contradiction so visible within the opening day keynote reinforced my own concern that Adobe and the CF team don't see the importance of some of the "soft side" of CF: the value of communication and the importance of getting the message right. I've increasingly felt that way for the past couple of years, and this conference repeatedly reinforced that concern.

The keynote itself suffered greatly from the combination of terrible acoustics, high A/V system volume, poor physical layout, small screens, and speakers with accents speaking way too quickly. There were portions, even during Ben's segment, where I simply could not understand what was being said, and it was even more challenging with the other members of the CF team. In watching the #CFSummit2013 twitter feed during and after the keynote, I know I'm not alone in those concerns or in my take on the keynote.

Language enhancements in ColdFusion Splendor (Vamseekrishna N of the Adobe CF team): Poor. Illegible slides (yellow text on a white background rendered at a small size, anyone?), almost entirely a repeat of material from the keynote (which admittedly should not have been in the keynote in the first place), poorly organized and presented content, and struggles with tools. (Seriously? A CF team member who doesn't know how to change the font size in Adobe's flagship CF development tool?) Poor audio, bad room layout, small screens. In terms of the changes coming in the language itself, I see some things I like: member functions, something closer to full coverage of the language's capabilities in CFSCRIPT, and better/more complete JSON handling.

Java Integration (Dave Gallerizzo): Good intro presentation to what you can do with Java both in using existing Java libraries and in invoking CF stuff from within Java. Good presenter. Nothing really earth-shattering or eye-opening for me there.

What's New and Different in ColdFusion 10 on Tomcat (Charlie Arehart): Very good. Charlie is one of the rare speakers with both extraordinary knowledge and the ability to share that knowledge effectively in the setting of a conference presentation. I've used Tomcat for the past couple of years outside of ColdFusion's stack, not just within it, but even so there were a couple of nuggets from his session that were new and potentially valuable to me.

Advanced OO in ColdFusion (Scott Stroz): Very good. Scott's usual blend of expertise and humor made for an informative and enjoyable session. Despite being assigned to a huge room, Scott has enough presence, is clearly an experienced presenter, and structured his slides in such a way that the setting itself did not detract from the presentation.

Preview: ColdFusion Splendor (Rupesh Kumar): Good. Lots of overlap with the technical detail already covered in the keynote and in the earlier session on coming language enhancements, but I felt like Rupesh did a much better job of covering the relevant content at an appropriate level and in an understandable manner. Realistically, though, I think the CF team would have been better served by making this the primary session on what's coming in the next version of CF and avoiding the duplication of content across at least three sessions. One slightly surprising item came out of this session: when I asked about approximate release dates for the next version, Rupesh indicated it would be "sometime next year", a pretty significant slip from earlier communications.

Reception: Good. Good setting, free drinks, and decent appetizers for a couple hours in one of the restaurants in the conference resort. Great setting to just sit and talk, as well as to circulate. Enjoyed a short conversation with Elishia Dvorak of the Adobe team, and appreciated the fact that someone from the Adobe team was making an effort to find out who was at the conference and what they were working on.

CFHour session (Scott Stroz and Dave Ferguson): Really poor. I've listened to their podcast for the past few months and typically enjoy (or sometimes just tolerate) their mix of information, opinions, commentary, and humor. This one seemed to take a big turn south when it became known that they were not going to be able to record the session. Any apparent attempt to provide value seemed to go out the window at that point. I should have just gone and gotten another beer. Funniest part was Jason Dean's brief appearance. Some interesting back and forth with the audience when the subject of the recent security event involving someone taking copies of source code for Adobe products was raised, but still no word of any sort from Adobe or the CF team on what was taken, what they are doing in the aftermath of the event, or whether they believe this event represents any sort of additional security concern for CF users.

Side note: Earlier in the day, I had stopped to ask a couple of questions of one of the CF team members at their table set up in the hall. I posed the question of when Adobe would be making further information available. The team member immediately turned to look at one of the more senior team members, who came over and recited almost verbatim the line that as long as I didn't have the CFIDE stuff publicly available and had my server locked down, there were no known risks. That wasn't even the question I asked... so clearly they had been directed to simply recite this line and try to move on. When I asked a second question about support for Apache 2.4, I got a shrug and a "we're not sure". OK, then... oh-for-two. Time wasted; get some lunch.

Day 2 keynote (Avi Rubin): Good. Interesting discussion of software and information security. Not specifically CF-focused, which generated some Twitter traffic, but still absolutely relevant to those of us working within the application development space. Very enjoyable change of pace.

NoSQL (Dan Wilson): Very good. Really approachable introduction to NoSQL through MongoDB. Good breadth and depth for an intro. Good mix of technical content, experience anecdotes, and Dan's dry sense of humor. I've never played with a NoSQL back-end for data storage, but after sitting through this session, I plan to look hard at it for certain portions of the less relational information we're collecting and maintaining in some of our applications. Great slides, even for code content and even taking into account the poorly laid-out presentation space.

Closures (Adam Tuttle): Very good. A great introduction to the concept. Really liked that his preso was based on Reveal.JS and was available for us to follow along with on our laptops while he went through it, completely eliminating any concerns about visibility/legibility of code snippets, etc. I've used callbacks and anonymous functions in jQuery more than in CF, and even occasionally some simple closures in JavaScript, so very much enjoyed digging a little deeper into this. Very much liked Adam's preso deck and appreciated his understated approach in presenting.

Amazon Web Services (Brian Klass): Very, very good. Probably the most eye-opening session of the conference for me. Brian did an incredible job of cramming a very broad and very deep subject into an hour. We've not done anything with moving toward AWS or any other cloud-based infrastructure yet but have been toying with a move in that direction for a couple of reasons. This was a real eye-opener for me as to just how different hosting in that environment would be in terms of architecting and maintaining infrastructure compared to just standing up another physical server in our current network DMZ. Great slides in terms of content and legibility, and a very knowledgeable, polished presenter.

CF911: Server Troubleshooting (Charlie Arehart): Very good. Typical Charlie Arehart presentation: great content, great slides, great delivery. As has always been the case when I've attended one of his sessions, I walked away with several specific items I will go dig into. I'm convinced Charlie is one of the real gems of the CF community, with his experience, knowledge, ability and willingness to teach and share, and his personality.

Advanced Caching (Rob Brooks-Bilson): Good. I figured this would be sort of a throw-away for me, as I sat in on it simply because there was nothing more interesting on the agenda at that point... but I have to say I was impressed. Very interesting presentation on what's becoming possible with some new products to enable caching on a huge scale. Not really relevant -- at the huge scale discussed in Rob's presentation -- to my current work, but it was interesting enough to keep me awake (despite being exhausted) and convinced me that I do need to look into caching at least a little bit.

Closing remarks (Adobe rep; no idea who he was and he didn't introduce himself): OK at best. Nothing earthshaking or new to wrap things up. Reiterated that there is growth in the Adobe CF customer base (at least when measured in new customers) and he emphasized that this was new customers, not just new license sales. Also touched on Adobe's efforts to support the CF community through their and others' education initiatives.

Summary

The conference itself came off reasonably well and has potential to be even better in the future. I certainly came away feeling like attending was both time and money well-spent. Some very, very good presentations with real value for me and my team. I continue to be concerned about Adobe's management, support, and vision for the evolution of the CF product, perhaps less so than 6 months ago, but still concerned. I'll write more on that in the near future.

Wednesday, October 2, 2013

One thing I know

If I have learned anything in this first half of my life, it is this: there are few things in life more satisfying than a well-constructed sand castle. And when your daughter helps you build it, the list gets even smaller.

Thursday, September 12, 2013

(untitled)

Oh, sweet nectar of life: few things taste as good as this first cup of home-brewed coffee after a 42+ hour fast! That peanut butter and honey sandwich was a distant second.

Wednesday, September 11, 2013

New entrants in the Chrome OS space

Big news today on the Chrome OS front: three new Chromebooks and a new Chromebox, all based on the Intel Haswell architecture. I'm interested in these on two different fronts:

  1. A possible replacement for my aging-but-still-plugging-along dev channel Cr-48, and
  2. A replacement for my aging Gateway desktop, which is currently running Ubuntu Linux and which we only turn on when we need to print something or when Li wants to play some of her PBSkids games or use ChessKids.com. That system seems ripe for replacement with a Chromebox, perhaps combined with a networked printer supporting Google's CloudPrint.

Lost in translation?

As a follow-on to my earlier post related to prematurely-expired chart images in Adobe ColdFusion, I came across this link: http://www.elliottsprehn.com/cfbugs/bugs/86743

The bug is clearly related to what I am wrestling with, although this bug appears to be more about orphaned image files piling up. What caught my eye initially were the comments about how at least one of the settings in the CF Admin UI was basically useless, and the apparent lack of documentation for this configuration file.

Out of curiosity, I wanted to see if this particular bug was ever resolved -- curiosity driven in large part because I am wrestling with this same sort of problem and could easily see this bug (if still unresolved) impacting my servers as we play with the configuration settings to get past the expired chart images.

Guess what? This bug does not appear to exist in the Adobe ColdFusion bug tracker in any form... no matter how I search, I can find no reference to it in their current system: not by content, by author, by ID, or even by date. I can find bugs prior to this and following this but not this one. Which raises a really interesting question: how many other issues like this, that a user of their products troubled themselves to investigate and document, got lost or dropped?

I don't really know what Elliott's system is drawing information from, but it seems like it might be worth Adobe taking a second look at what might have gotten lost.

Monday, September 9, 2013

Changing perceptions

"For folks who are concerned about putting your digital life in the hands of PPL connect, all transmissions to and from its servers are encrypted. And, the company is currently devising a fully encrypted system whereby the data's only accessible with a single, user-owned key."

From an article on Engadget, but the source really doesn't matter. And you can change the words "PPL connect" to pretty much anything you choose.

My point is that this certainly does not carry the same weight today that it might have a few days ago.

Sunday, September 8, 2013

Sounds

Sitting by an open window this evening, listening to the low rumbles of distant thunder as a late summer storm rolls through on one side and the trickling water of Deb's fountain on the other, and enjoying a couple of finds from the music Ian left us, thinking we might like it: Robert Miles' "Dreamland" and Kruder & Dorfmeister's "G-Stoned". The K&D, in particular, is both a little surprising (but not a lot, given Ian's tastes in music) and pleasing.

(untitled)

Rough day...

Saturday, September 7, 2013

Just wondering...

Deb and I spent yesterday morning on our (typically) twice-per-year task of washing down the outside of our house. Deb almost always couples this task with washing all of the windows, inside and out. The only really enjoyable aspect of this 4-hour job is that it is a good excuse for me to get out the power-washer, but other than that, it is a pretty lousy way to spend a Friday off from work.

Every time we do this, some variation of the same question comes to mind: why do we never see anyone else washing their house? I can't believe we're the only people who do, and although I've never walked right up to a statistically significant sampling of homes in our neighborhood to see how dirty they are, I have to believe other home-owners (aside from those "responsible" for the nearly-abandoned house just to the west of us) have some means of keeping their house clean.

Thoughts?

Stuff Li says

Li doesn't like licking envelopes to seal them. She says they "... taste like couscous."

A surprising highlight

Li's U-8 girls team played their first match of the fall season this morning. The end result was, in truth, pretty ugly (a 9-1 loss) but one of the real highlights came from Li. She played the third "quarter" (yes, quarters in soccer: they split the match into quarters to facilitate substitutions) as goalkeeper, and kept a clean sheet. She managed to grab a couple balls, and blocked a couple others.

This from the player who, before the season started when she found out we had a goalkeeper on the field at this level, was pretty emphatic that she did not want to play in goal.

Friday, September 6, 2013

Wrestling with expired chart images in Adobe ColdFusion

TL;DR

Change the timeouts specified in webcharts3d.xml

Long Version

We are in the midst of migrating several existing Web applications to Adobe ColdFusion 10 (ACF10) as part of a server consolidation effort at work. One of these apps has a page that takes a Web-eternity to generate (over 30 seconds in some situations, depending on load and on what the user has opted to include), based on the sheer volume of data that has to be digested, and includes several charts created by invoking cfchart. For years, we've struggled with situations where the chart images in this particular report don't show up and are instead replaced with placeholder images indicating the image has "expired". In spite of playing with the limited configuration options related to chart caching in the ACF administrator UI, we've never really found a way to ensure the chart images don't expire. (And, yes, we've invested a significant amount of time in improving the efficiency of the logic behind the report and the performance of the underlying queries used to retrieve the data.) I think we finally made a bit of a breakthrough on this front yesterday.

I was wrestling with this problem on my development server, which, although reasonably fast, is significantly slower than our staging server where we are testing this particular app. Because my dev server is slower, the problem with these expired chart images was exacerbated, and I was consistently getting this particular report back with all of the images expired: good for testing and troubleshooting, but useless in terms of the report itself. The ACF admin UI provides only two configuration options that seemed relevant to solving the problem: the size of the cache (how many chart images will be cached) and whether the chart images are cached to disk or to memory. The cache size on my dev server was configured at 200 charts, but given that I am the only one who can connect to my dev server, it made no sense that the half-dozen charts on this report could be expiring because the cache was filling up. They had to be expiring based on something else... age, perhaps?

I verified this by switching the cache to be on disk and watching the content of the folder being used for the cache. In spite of the specified cache size in the ACF admin UI, there were never more than a handful of image files present in the folder. Something other than the number of chart images was at play here.

A quick search of the ACF bug list yielded nothing relevant in terms of a known problem. Searching the ACF forums brought up a couple other topics related to expired images but nothing that on the surface seemed really relevant to what I was seeing. The entry that seemed most relevant was in a clustered environment and seemed to be related to how requests got moved among the various servers in the cluster and how that seemed to be leading to expired chart images. But... in that post there was a reference to an XML configuration file for the underlying charting engine... and a timeout parameter in that file.

The ACF forum post referred to minTimeout and maxTimeout parameters in a configuration file named webcharts3d.xml. At least on my system, that file resides at {cf_home}/cfusion/lib/webcharts3d.xml. Locate the <server ...> element near the top of the file, and adjust the minTimeout and maxTimeout attributes. On my system, they were set to 5000 and 30000, respectively, which I believe to be 5 and 30 seconds. When I saw those values, I was convinced I had found my solution, as it seemed like when the report in question took more than 30 seconds to generate, I was consistently getting the expired images. Recognizing that this long-running report can take as long as a couple minutes to generate under load, I changed the minTimeout attribute to 120000 (120 seconds) and the maxTimeout attribute to 300000 (5 minutes). I saved the file and bounced ACF on my dev server, and... no more expired images. Watching the cache folder, I started seeing images pile up (as I would have expected). Problem solved.
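For reference, the edit looks roughly like the following. Treat this as a sketch: I haven't found any documentation for this file, so the element shown reflects only the attribute names observed on my own system, and any other attributes or child elements should be left as you find them:

```xml
<!-- {cf_home}/cfusion/lib/webcharts3d.xml: the <server> element near the top.
     Timeouts appear to be in milliseconds; the original values on my system
     were minTimeout="5000" and maxTimeout="30000". -->
<server minTimeout="120000" maxTimeout="300000">
  <!-- existing child elements, if any, remain unchanged -->
</server>
```

Remember to restart ColdFusion after saving the change, since the file only appears to be read at startup.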

Having said that, I still have a couple questions:
  1. What exactly do those configuration parameters in the ACF admin UI for cache size and number of threads actually control (if anything)?
  2. Is there documentation somewhere for the webcharts3d.xml file, explaining what the cache, minTimeout, maxTimeout, cacheSize, and maxThreads attributes do? Several of those have similarly-named settings in the CF admin UI but updating the CF admin UI values does not touch this file. So far, I've not been able to find any documentation on-line.
As is often the case, now that I have some idea of how to solve the problem, I can find at least half a dozen posts or articles via Google describing the solution. With this post, there is now one more.

And for the record, this just might be the first time I've found something of value in the ACF forums, but that's a topic for another post and another day.

The importance of purpose

Li started first grade this week. In her class, each of the students is assigned a job and they rotate through the jobs over the course of the school year. I love the fact that Li is so excited to have been assigned the job of "paper and pencil monitor" even though she indicates she has no idea what that job means or what she will have to do...

Thursday, September 5, 2013