Sunday, July 10, 2011

The Golden Rule of Finance

There is one Golden Rule to successful personal finance: spend less than you earn.  When struggling financially, there are two sides to that equation you can work with.  Either you modify your earnings (sell things, get another job, get a new job, or just plain get a job), or you modify your expenses (consolidate debt for lower interest, cut unnecessary bills, reduce discretionary spending, move someplace cheaper, clip coupons, etc.).  You can earn less than you spend and survive, but not for very long.  Debt will bury you.
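For the arithmetically inclined, the whole rule fits in one line of code.  A toy sketch, with numbers invented purely for illustration:

```python
# The Golden Rule as arithmetic: only the sign of (earned - spent) matters.
def month_end_balance(balance, earned, spent):
    """One month of the equation: carry the balance forward."""
    return balance + earned - spent

balance = 0
for month in range(12):            # a year of overspending by $100/month
    balance = month_end_balance(balance, earned=3000, spent=3100)

print(balance)   # -1200: a small monthly imbalance compounds into real debt
```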

Of course we violate this rule all the time, and sometimes it works out okay.  But never for long.  Maybe you take out student loans to go to school.  Assuming you get a job that increases the earning side of the equation, you can pay that back and eventually come out ahead.  Maybe what you earn in the month of December is less than what you spend for Christmas.  Assuming you saved a little during the other 11 months, you can usually afford to do this.  Maybe you buy a house that you could afford at the time, but then lose your job or change jobs and suddenly earn less.  Assuming you have enough savings cushion, you might still be able to make your payments.  But maybe you'll have to move someplace cheaper.

There's one other piece of this puzzle I've not touched on fully.  It bears mentioning that the Golden Rule of Finance has a Silver Sidekick.  As much as you can, plan for the future.  This means you need your earnings to exceed your spending enough that you can save the excess.  Incurring debt by borrowing makes certain assumptions about the future.  Namely that what you want to have now is important enough that you'll commit to future repayment.  But the future is often unpredictable, and debt repayment has a way of highlighting that like nothing else.

Now, you can occasionally find yourself out of balance with the Golden Rule of Finance and still be okay.  The road of life is full of bumps and twists and unexpected turns.  But when trouble comes, it's how we react to it that matters.  Do we recognize the road we're on?  Do we see the imbalance when it's there or predict a future imbalance?  Are we willing to make sacrifices to modify the equation?  That last question is key, because if you find that you spend more than you earn, you cannot modify the equation without sacrifice.  Sometimes difficult.  Usually painful.  Often realized late.

That's where our nation finds itself with this debate on the debt ceiling.  The same simple equation for personal finance applies here, just on a much grander scale.  The trouble is, the federal government's earning side of the equation is not normal.  They don't earn money so much as take it or make it.  They take it via taxes.  They make it either literally via a printing press, or artificially via methods like quantitative easing.  Taking money via taxes stymies growth in the private sector by reducing consumer spending or by raising unemployment, which is a bigger problem still because many taxes are tied intimately to employment.  As unemployment rises, the earning side of the equation suffers further.  Making money contributes heavily to inflation.  Interestingly, inflation is somewhat good for debt, because the money we borrowed in the past was more valuable than the future money we use to pay it back.  Of course, in every other way inflation comes at a heavy price, quite literally, as future buying power is reduced.
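That "inflation is somewhat good for debt" point is easy to put numbers on.  A quick sketch, with the rate and amount made up for illustration:

```python
# A dollar repaid in the future buys less than a dollar borrowed today.
# What is $1000, repaid 10 years out, worth in today's dollars at 3%
# annual inflation?
def real_value(nominal, inflation_rate, years):
    return nominal / ((1 + inflation_rate) ** years)

today_value = real_value(1000, 0.03, 10)
print(round(today_value, 2))   # 744.09 -- the lender eats the difference
```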

The US has committed to spending more than it earns, and to borrowing to cover the difference.  That course is not sustainable, and that's what the current debate in Congress is about.  It's not politically viable to make the sacrifices necessary to balance the equation.  Raising taxes is unpopular and potentially job-killing at a time when 'economic recovery' is tenuous at best.  Cutting spending is not politically viable either, as there is always some advocate on the receiving end of those benefits who will make a fuss.  Of course, the worst thing we could do is just borrow again and not change either side of the equation.  Raising the debt ceiling means that we'll just continue the borrowing cycle, which only ends one of two ways - eventual repayment (sacrifice) or eventual default (greater sacrifice).

Nancy Pelosi recently asked why the vote on raising the debt ceiling couldn't be decoupled from the vote on spending.  The question is alarming because it shows how ignorant those at the helm are of the situation we're facing.  I recently read someone describe this as a fat man gearing up to eat a trillion more donuts before promising to finally diet.  Politics are going to have to be put aside, and possibly political careers as well, and both sides of the equation are going to have to be modified if calamity is to be avoided.

Ultimately though, the sacrifice will happen.  If not in this generation, then I fear the next will be forced to pay it.  And by then the sacrifice will have accrued interest.

Wednesday, July 6, 2011

.ToString() antipattern

Nothing worse than finding .ToLower().ToString() littered throughout your codebase.  Calling .ToString() on an object that is already a String is annoying.  Don't do it. Ever.
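The same smell exists outside .NET, for what it's worth.  In Python, for example, calling str() on something that's already a string is a pure no-op:

```python
s = "Hello World"
u = s.lower()        # .lower() already returns a str
print(str(u) is u)   # True -- str() on a str hands back the very same object
```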

Tuesday, June 7, 2011

Surely you got the IM/email/tweet?

Paper is dead.  Surely you got the IM/email/tweet?

It's amazing how often people need to be reminded of that.  I just got wrapped up in a discussion about how hard it might be for some not-so-savvy people to print from their browser, and how we should create PDF versions of some web pages so that our customers can easily print.  Seriously?!  I'm all for serving my customers, but that's borderline insanity.

How about a simple icon and a JavaScript window.print() call?

Friday, May 27, 2011

With You

With you

The way you say my name
The sound that your voice makes
The funny way you turn a phrase
Sounds like love to me

And when I feel alone
Your tender words they calm the storm
They are simple and yet so strong
And they sound like love to me

When you speak
It's like you know just what my heart needs
Your words they bring healing to me
And help me to remember what is true
And I hear love when I am listening to you

I look into your eyes
Full of wonder and surprise
The care your gaze implies
Looks like love to me

You have seen me at my best
And wrapped up in my arrogance
You remain unimpressed
And it looks like love to me

And you see
Not what I am but what I can be
The way you look at life it is inspiring
And it helps me to remember what is true
And I see love, when I am looking at you

Every day I am amazed
That God would show His love for me this way
When you speak
It's like you know just what my heart needs
The way you look at life it is inspiring
And it helps me to remember what is true
That I hear love, when I am listening
And I see love, when I am looking
And I find love
With You

- Geoff Moore and The Distance

Happy Anniversary Beth

Monday, May 9, 2011

ByVal goes bye, bye

One of the nice enhancements to VB.NET in Visual Studio 2010 SP1 is that it no longer auto-inserts the ByVal keyword.  If you don't type ByVal, Visual Studio now respects that and doesn't insert it for you.  Since ByVal is the default anyway, it's much nicer to omit it so that when you do use ByRef, it really stands out.  A simple find-and-replace of your ByVals, and your code instantly looks and feels cleaner.  Thanks Microsoft!

Thursday, April 28, 2011

Lost usernames

mattmc3 was the handle I chose back in 1997 to use AOL's instant messenger.  Since then, I've tried to sign up as mattmc3 for nearly everything I could, but unfortunately I've not been very successful.  While I happily have the e-mail addresses I want, at other services like Twitter, Slashdot, banks, and various other places, mattmc3 was already taken.  It's funny how your online handle can force you to sign up early for things you have absolutely no interest in, just to claim your username.  Interestingly, I think this is part of the reason I don't do Twitter.  Not only is the content there mostly useless, but the guy who got my mattmc3 username is peddling horoscopes, and I have to wonder whether someone looking me up by username might mistake him for me, or me for him.  When some rival service launches (called, say, Sylvester - the hunter of Tweety birds), I'll be ready to sign up... not to use the service, but just to defend my good (user)name.

Wednesday, April 27, 2011

Digital Cable and the DVR

Our TV is an old hand-me-down Zenith from 20-some years ago.  I couldn't care less about digital cable.  But, when the cable company sent us a letter and numerous e-mails telling us that we needed a converter and that our channels were destined for digital cable, I started to worry.  I ordered the (free) converter, and waited.

It didn't take long to figure out that the converter wouldn't work with the DVR unless I wanted to change the channel on the converter before every show being recorded.  Not gonna happen.  I needed an alternative.  Of course, not knowing how the digital cable thing works, I now had to figure out what to do.  Are they encrypting the signal to force me to use a CableCARD or their DVR?  What is ATSC?  QAM?  These things made no sense to me.  Thankfully, I discovered that the TV tuner card I bought has a separate port for digital cable.  Cool.  Split the cable, plug the analog line back in, and run the digital line to the port on the card I bought and haven't touched in two years.  Scan for new channels and all is good.  It's scary how easy this was, so I'm crossing my fingers and waiting for the other shoe to drop.  I think I just need to re-create my recordings.  Not bad.

Monday, April 25, 2011

String or binary data would be truncated

This, in my opinion, is the most frustrating error you can get in SQL Server.  It's the error you get when you're trying to do an "INSERT INTO (...) SELECT (...)" to push a bunch of records from a query result into a table.  If one of your (n)varchar fields is too small to hold one of the values, you get this lovely error.  It's nice that SQL Server won't silently truncate your fields for you, but maddening that you get this sad little error with no details.  You don't know which field is causing the problem, and if you're inserting into a large number of fields, it's a chore to figure out what caused the issue.  Especially when SQL Server could easily have done that heavy lifting for you and told you in the error details.

I've hit this error enough that I have developed a simple technique for troubleshooting this.  It's not fancy, and you could probably think of something better, but this works and gets me to my resolution in a few seconds so I thought it would be worth sharing.  The trick is to turn your "INSERT INTO (...) SELECT (...)" into a "SELECT (...) INTO _BadDataTable_ (...)" statement.  The SELECT INTO will create a new table for you on the fly, assuming that you use the AS clause in your SELECT to name your fields the same as the destination table you're looking to INSERT INTO.

From there, you can then run this simple SQL script changing the values of @prodTable and @invalidTable into the real names of the destination table and the one you made on the fly containing the bad data.  Kinda hackish, but it works and that's all I'm looking for.  Note that you have to make a real table with the fields named the same for this to work.  Feel free to take and modify to suit your needs, and here's to hoping the next release of SQL Server will fix this completely arcane error:

-- setup
declare
    @prodTable varchar(100) = 'MyDestTable', -- CHANGE THIS!!!
    @invalidTable varchar(100) = '_BadDataTable_', -- AND THIS!!!
    @c cursor,
    @column_name varchar(255),
    @max_len int

if object_id('tempdb..#max_len') is not null drop table #max_len
create table #max_len (max_len int)

-- get schema for prod table
if object_id('tempdb..#prod_table_schema') is not null drop table #prod_table_schema
select
    a.column_name,
    a.ordinal_position,
    a.character_maximum_length
into #prod_table_schema
from information_schema.columns a
where
    a.table_name = @prodTable
order by 1

-- get schema for table made from invalid data with the truncate error
if object_id('tempdb..#invalid_table_schema') is not null drop table #invalid_table_schema
select
    a.column_name,
    a.ordinal_position,
    cast(null as int) as character_maximum_length,
    case when character_maximum_length is null then 0 else 1 end as is_char_field
into #invalid_table_schema
from information_schema.columns a
where
    a.table_name = @invalidTable
order by 1

-- we need to chase after the max(len(COL)) info with a dynamic query
set @c = cursor local fast_forward for
    select a.column_name
    from #invalid_table_schema a
    where a.is_char_field = 1
    order by a.ordinal_position
open @c

fetch next from @c into @column_name
while @@fetch_status = 0 begin
    -- get the maximum data length of the field from the table
    delete #max_len
    insert into #max_len
    exec('select max(len(' + @column_name + ')) as max_len from ' + @invalidTable)
    select @max_len = max_len from #max_len
    update #invalid_table_schema
    set character_maximum_length = @max_len
    where column_name = @column_name

    fetch next from @c into @column_name
end
close @c
deallocate @c

-- Tell me which fields have the problem
select
    a.column_name,
    a.character_maximum_length as defined_max_length,
    b.character_maximum_length as actual_length_of_data
from
    #prod_table_schema a join
    #invalid_table_schema b on
        a.column_name = b.column_name
where
    a.character_maximum_length < b.character_maximum_length
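For anyone who wants the idea without the T-SQL, here's a minimal sketch of the same check in Python - the schema and rows below are made-up examples, not a real API:

```python
# Given the destination table's declared (n)varchar lengths and the rows
# you're about to insert, report each column whose data would be truncated.
dest_schema = {"name": 10, "city": 5}        # column -> declared max length
rows = [
    {"name": "Alice", "city": "Columbus"},   # "Columbus" is 8 chars, > 5
    {"name": "Bob",   "city": "Rome"},
]

def truncation_report(schema, rows):
    """Return {column: longest value length} for columns that would overflow."""
    actual = {col: max(len(row[col]) for row in rows) for col in schema}
    return {col: n for col, n in actual.items() if n > schema[col]}

print(truncation_report(dest_schema, rows))  # {'city': 8}
```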

Sunday, April 24, 2011

VB.NET finally gets the Yield statement

I've found myself multiple times in the last four years having to defend Visual Basic against the disgusted reactions of some of my C# developer friends.  I think most people who pinch their noses at the term "VB" aren't really reacting to the language, but more to the stereotypical perception of what they believe to be the quality of VB developers in general.  It's easier to argue technical merits than stereotypes, so there's little to say other than that good developers are versatile and objective when evaluating technologies and each other.

I admit that when I first started, it took me a week or two to swallow my pride and accept that we have a pretty extensive codebase written in VB, and there's very little compelling reason to change it.  And now I actually enjoy it almost as much as C#... except for lambdas... ugh, so verbose.  Anyway, over time C# and VB have converged, and the similarities far outweigh the differences.  If you can do it in C#, you can almost certainly do it in VB, with a few notable exceptions.  In fact, since VB got auto-properties and Option Infer, lately I've been describing it thusly:

VB really is no different than C#.  You have all the same libraries.  It's a little more verbose in places, but a lot more readable in others.  Once you get used to case-insensitive languages, you'll wonder why there aren't more of them out there.  The only features you might miss are the dynamic and yield keywords, and unsafe if you ever use it (which you shouldn't).  Actually, dynamic behavior has been available in VB since the beginning, but it's file scoped instead of variable scoped.  So Yield is really the only thing you'll wish you could do, but can't.

And now, at long last - with the new Microsoft Visual Studio Async CTP (SP1 Refresh), we've finally gotten a Yield keyword in VB.  This changes my whole elevator pitch.

I've installed it, and toyed with it, and it works exactly as you'd expect.  Here's the arbitrary infinite Fibonacci sequence hacked together in 30 seconds in VB:

Module Module1

 Sub Main()
  For Each i In GetFib()
   If i > 500 Then Exit For
   Console.WriteLine(i)
  Next
 End Sub

 Public Iterator Function GetFib() As IEnumerable(Of Long)
  ' Forget math overflow... this is only a test
  Yield 0
  Yield 1
  Dim previous = 0, current = 1
  While True
   Dim nextVal = previous + current
   previous = current
   current = nextVal
   Yield current
  End While

 End Function

End Module 
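If the syntax looks familiar, it should - this is the same pattern Python generators have used for years.  The equivalent infinite Fibonacci sequence there:

```python
from itertools import takewhile

def fib():
    # Infinite Fibonacci generator -- the caller decides when to stop.
    yield 0
    yield 1
    previous, current = 0, 1
    while True:
        previous, current = current, previous + current
        yield current

print(list(takewhile(lambda i: i <= 500, fib())))
# [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377]
```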

It's only 6 years overdue, but still such a welcome addition to the VB family. I can now finally get rid of my bloated iterator classes for my OrderedDictionary and my TreeNode classes. I can start porting projects like Dapper to VB without bloating the code. Thank you Microsoft. You've made my work life so much less painful.

Friday, April 1, 2011

Google Reader Bankruptcy

It's April 1st.  Nothing to see here today that's worth anything... move along.

Monday, March 28, 2011

IT Celebrity Deathmatch?

Apparently James Gosling joined Google today, and with Anders Hejlsberg at Microsoft, I was reminded of this old claymation show for some reason.  Hmmm...

Saturday, March 26, 2011

Firefox 4... a major disappointment

Battleship grey, and a bit uninspiring
I've been a user of Mozilla's Firefox since it was first called Phoenix, then Firebird.  It's been an amazing web browser, but something has happened with this latest release and I fear the magic has gone.  Google's release of its Chrome browser appears to have shaken things up at Mozilla, and the result is that Firefox 4 has become a cheap knock-off playing catch-up instead of the innovative leader.  Much like all the wanna-be iPads, Firefox has decided it wants to be Chrome, and it fails to distinguish itself.

I was part of the beta test group for Firefox 4, and I used it on both my PC and my Mac.  The deeper into the beta we got, the worse it looked.  Consider the initial look - it's really, really ugly.  The tabs are uninspiring compared to Chrome's, and the battleship grey on the Mac makes me throw up a little in my mouth.  Googling for "Firefox fonts look bad" turns up a whole host of people who noticed too.  The apparent solution is messing around with settings in about:config.

Another oddity is that the download size has more than tripled.  The last version of the 3.x series was just over 8MB.  The 4.x series comes in at a whopping 28MB.  Not huge by today's standards, but certainly noteworthy in its stark contrast to the previously trim-and-slim size.  Firefox 4 has put on some sizable love handles, without much to show for it.

The one new feature that appears to be somewhat innovative is the "tab groups".  This is sort of a task manager for your browser windows.  It seems neat at first, but it winds up being a solution in search of a problem.   Other than an initial time playing around, I've never felt the need to use this feature.  I find that when I get too many tabs, I just open a new browser window.  Then, you can actually drag tabs from one browser session to another.  Problem solved without this "tab groups" thing.  It seems like a lot of wasted development, which may have been part of why the download is so big and the final release was delayed so long.  Tab groups would have been better suited to a plug-in than a bloated browser feature.

What's most notable to me about Firefox 4 isn't so much what's in it, but what's missing.  The orange RSS feed button that showed in your URL bar when a site you visited had a feed is sadly gone.  It's replaced with a hidden toolbar button, much like the bookmarks button that you have to find in the toolbar customization dialog.  However, it's really difficult to tell when the button is lit up indicating an available feed, so it's just not nearly as nice as the original.  Mozilla got this feature right a long time ago before any of the other browsers did, and then they threw it all away.

The other head-scratcher is that they ditched the status bar - the bar that sits at the bottom of your screen and tells you what site you're about to go to when you hover over a link.  This move was presumably to mimic Chrome and regain some screen real estate for page content.  The trouble is, Mozilla had this one right too and blew it.  Chrome makes its plugins take up precious space in the primary menu bar, whereas many Firefox plugins like Greasemonkey and Adblock Plus sat out of the way on the status bar.  So great is the love of the status bar that there is a plugin called Status-4-Evar to bring it back, and it already has over 100,000 downloads.

As far as being quick, stable, and functional goes, Firefox 4 does well.  But frankly, that's to be expected from a browser that isn't Internet Explorer.  Much the same way the only thing John Kerry had going for him was that he wasn't George W. Bush, I'm not sure that just being better than Internet Explorer is going to work out for Firefox in the long run.  And I'm not convinced they're even that much better than IE anymore... with IE9 clearly the best browser ever to come from Microsoft, and Chrome the best browser available today, Firefox is going to have to step up its game.  They need to quit pulling features that work and that people love, quit trying to be Chrome and doing it badly, and quit coming up with complicated solutions to simple problems in an effort to look innovative.

Sunday, March 20, 2011

Amazon Prime and our giant Sasquatch-sized carbon footprint

We recently tried Amazon Prime free for a month on the recommendation of a friend.  Amazon Prime is a service Amazon offers where you get free two-day shipping on most everything you order directly from Amazon.  The service runs $79 a year, but the idea is that between free shipping on frequent orders, enhanced customer service, and saved trips out, it'll pay for itself quickly.

Let's say you keep a list of items you need from the store on your next trip.  With Amazon Prime, in theory, instead of keeping that list you could order the items from Amazon, and they would probably arrive at your house before you made that next trip to the store(s).  Used that way, it would change the way you shopped.  And that is how we tried to use it - as a way to order anything we thought we would have picked up on our next trip out.  The kids wanted to spend allowance on Nerf guns, so we did the research online and ordered in our PJs.  I wanted to get Beth a nice necklace for her birthday.  We knew we needed a baby gate and some odds-and-ends for Alison.  Whenever we thought of something we 'needed', we ordered it.

Of course, there were things that didn't qualify for the free two-day shipping because Amazon wasn't the supplier, just a storefront for someone else.  And there were things that weren't cheaper on Amazon, even with the free shipping.  And there was the knowledge that Amazon doesn't collect sales tax, so I'd have to keep track of it for filing my state taxes next April.  And things like baby food had to be bought in bulk, because you can't get that stuff in small packages online.  And every review was begging to be read for every purchase, simply because they were there, and somehow random people on the internet started to have some say in what I bought.  The UPS guy started having conversations with me like we were old friends.  I was making two trips a week to the recycling drop-off to account for all the extra boxes.  And it wasn't just the boxes and deliveries that added up - so did the credit card bill.  Not too bad, but certainly not typical.

And after a month of watching this occur, it became obvious to us what we were sacrificing so that we didn't have to plan for or think about our purchases.  What was the real cost of this convenience - in manpower, in shipping and handling, packaging, fuel, and ultimately to our expectations?  How had our attitude toward planning and budgeting and smart shopping changed?  How had our strong beliefs in living simply, spending wisely, budgeting effectively, and being good stewards of our world held up?  I wasn't sure I liked those answers, and so we happily let our month-long trial expire.

Saturday, March 19, 2011

Fun with DynamicObject and making .NET reflection less painful

If you're a Microsoft techie, you probably noticed that RC1 of Entity Framework 4.1 was released this week.  One of the things I've been waiting for with EF is the ability to run cross database queries on the same server - for example, being able to join MyDB.dbo.MyTable to YourDB.dbo.YourTable when MyDB and YourDB are SQL Server databases on the same server.  It's crazy that MS hasn't added this feature yet, since they seem to want to speed up EF adoption and since Linq-to-SQL does this perfectly.  But, I don't want to actually talk about this... this isn't what this post is about.  This issue is what started me on a different track.

You see, I wondered if there was a way to hack this functionality in by examining the EF code using Red Gate's Reflector tool.  As I examined the guts of EF, the parts that mattered were buried deep in the code - internal objects and private properties and methods.  Figuring out the code seemed like it would be way easier if I could examine the guts and change parts at run-time (I haven't figured this out, by the way, so if you're here hoping I cracked this nut, you need to go back to Microsoft and the EF team and complain to them).  To really examine the code, though, I'd need to use reflection to call methods and properties in private areas of the code, which is always a painful experience.  And that's when I turned to the new dynamic features of C#, which are the topic of this post.

If you don't know about the dynamic keyword in C#, its purpose is to remove compile-time type safety on a variable.  If a variable is declared dynamic, you can call properties and methods on it, and those calls aren't resolved until run-time.  Those properties and methods may not even exist, but you can intercept the calls and do some magic with them, like dynamically adding those properties or methods to the object.  See this stackoverflow question and my answer to get some idea of how this works.  This sort of thing is nothing new for users of dynamic languages like Ruby, but for compile-time checked languages, it's a really nice advancement.
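If you've never seen member calls resolved at run-time, a little Python makes the idea concrete - __getattr__ plays roughly the role TryGetMember does in C#.  The get_-prefix convention below is just an invented example:

```python
class Dynamic:
    # Intercept access to members that were never declared and
    # synthesize a result on the fly -- the essence of late binding.
    def __getattr__(self, name):
        if name.startswith("get_"):
            return lambda: name[4:]   # fabricate a "getter" from the name
        raise AttributeError(name)

obj = Dynamic()
print(obj.get_answer())   # answer -- no such method was ever defined
```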

So, all that to say, I decided to experiment with the dynamic features to provide a much simpler way of accessing private and protected aspects of a class at run-time.  I've code named this project "LookingGlass" as a play on reflection.  Here's a sample:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Dynamic;
using System.Reflection;

public class LookingGlass : DynamicObject {
 public dynamic TheObject { get; private set; }
 public Type TheObjectType;

 public LookingGlass(dynamic theObject) {
  this.TheObject = theObject;
  this.TheObjectType = this.TheObject.GetType();
 }

 public override bool TryGetIndex(GetIndexBinder binder, object[] indexes, out object result) {
  result = null;
  int index = (int)indexes[0];
  var value = this.TheObject[index];
  result = new LookingGlass(value);
  return true;
 }

 public override bool TryGetMember(GetMemberBinder binder, out object result) {
  result = null;

  var field = GetField(binder.Name);
  if (field != null) {
   var value = field.GetValue(this.TheObject);
   result = new LookingGlass(value);
   return true;
  }

  var prop = GetProperty(binder.Name);
  if (prop != null) {
   var value = prop.GetValue(this.TheObject, null);
   result = new LookingGlass(value);
   return true;
  }
  return false;
 }

 public override bool TryInvokeMember(InvokeMemberBinder binder, object[] args, out object result) {
  result = null;
  var flags = BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.InvokeMethod;
  var value = this.TheObjectType.InvokeMember(binder.Name, flags, Type.DefaultBinder, this.TheObject, args);
  if (value != null) {
   result = new LookingGlass(value);
  }
  return true;
 }

 private FieldInfo GetField(string fieldName) {
  var flags = BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance;
  var field = this.TheObjectType.GetField(fieldName, flags);
  return field;
 }

 private PropertyInfo GetProperty(string propertyName) {
  var flags = BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance;
  var prop = this.TheObjectType.GetProperty(propertyName, flags);
  return prop;
 }

 public override string ToString() {
  return this.TheObject.ToString();
 }
}
Obviously, this is quite a lot to take in, but the gist is that it lets you open up any object as if its private fields, methods, and properties were all public.  It's not exactly fast, nor was it designed to be robust if the thing you're calling isn't part of the object, but it solved my immediate need of inspecting some of the Entity Framework guts at run-time.  To try it, make yourself a class called Test with some private fields, methods, or properties on it, and then call it like this:

var tst = new Test();
dynamic totallyOpenObj = new LookingGlass(tst);

You can also chain calls as each object returned is a new dynamic LookingGlass object that lets you explore it as well.  It handles overloaded methods and indexers too.  If you'd like to change the objects you're interrogating, DynamicObject lets you override TrySetMember and TrySetIndex and their ilk.  Hope this gives you some ideas on where you can go with the new dynamic features in C#.  This opens up a whole new world of possibilities for C# devs.  Happy coding!
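For the curious, the trick translates outside .NET too.  Here's a rough Python analog of LookingGlass, using __getattr__ to reach a "private" (name-mangled) member - a sketch, not a faithful port:

```python
class LookingGlass:
    # Wrap an object and let attribute access fall through to private,
    # name-mangled members as if they were public.
    def __init__(self, obj):
        self._obj = obj

    def __getattr__(self, name):
        mangled = "_%s__%s" % (type(self._obj).__name__, name)
        if hasattr(self._obj, mangled):
            return getattr(self._obj, mangled)
        return getattr(self._obj, name)

class Test:
    def __init__(self):
        self.__secret = "hidden"   # stored as _Test__secret

print(LookingGlass(Test()).secret)   # hidden
```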

Thursday, March 3, 2011


The fun things you find when digging through old hard drives... here are some poems I've run across from a very long time ago when I actually had time to write poetry...

“better to leave when you don’t want to”
She Says
but that kind of absence appeals only to the tortured
or to hermits
of which I am neither

but the idea seems so intriguing
That I Am
forced to consider
that maybe friends ought to try harder
to leave before they want to

if it means that
Her Very
last thoughts
were not wanting to leave
‘cause those thoughts were mine too

Best Friend

Mind you, I'm not saying these are necessarily any good...


Why? What is the point
in the sharing of joints
between two things that we eat with.
Why did they do it?
Who could construe it?
Did they think that it was a gift?

I wonder the maker.
a Monk or a Quaker
in some lab somewhere out there.
He’s crazy and old,
or bitter and cold
with some wild, freaky white hair.

I can picture the place
and the look on his face
when he came up with this “plan.”
I can see the light
as his face got bright
and he asked his friend for a hand.

They glued those utensils
like erasers and pencils
And stared again and again.
This monster’s too lewd
to use on your food
Stabbing with Siamese twins.

But they proceeded to patent
I wish that they hadn’t
subjected the world to this tool.
It’s called a spork,
it’s neither a spoon nor a fork
and to eat with it you look like a fool.

And now that I've weeded out the riff-raff with the last two, I'll share two more I really like with those who made it this far... This one reminds me of holding Alison...

A thousand kisses just aren't enough
to hold the memory of her touch
and millions more would not begin
to grasp the softness of her skin
Those big blue eyes and thoughtful gaze
float off, into her sleepy daze
And there upon my arm she sleeps
Moments fleeting I yearn to keep

And this one, which always makes me want to write again, and shows my appreciation for a really fine pen...
mightier than…
a green pen’s liquid
random, scribbled ink
Bleeding chlorophyll
breeding as it flows
Planted by the silky hands of an angel
an angel loved by mere primates
primates whose hands could never hope
to leak such beauty
As the soundless words.
like the blood of The Savior
An angel with actions so fluid
that the seeds on the paper never thirst
Motion so liquid that one cannot help
but to thirst, to drink, to sip
to tenderly, gingerly, lift to the mouth
But by lifting the pen
the life flowing must cease
and leave the plants thirsty
begging for more

Thursday, February 17, 2011

Linq with XML literals

Every once in a while I write something in VB that feels like the days long gone when I did crazy kung-fu Perl/Ruby.  It's a strange mix of being proud, shocked, and slightly embarrassed, but thrilled that the problem is fully solved.  Is it ugly?  Like a pit bull with lipstick - you betcha!  But still, for those who really dig Linq and think that XML literals in VB are pretty slick, check this out - it converts a System.Data.DataTable to an XHTML table that I could then use in a quick-and-dirty e-mail alert I needed to send out based on the results of some ad-hoc queries.

Public Shared Function ToHtml(ByVal dt As DataTable) As System.Xml.Linq.XDocument
 Dim resultHtml = _
  <?xml version="1.0"?>
  <table name=<%= dt.TableName %>>
   <tr class="row0"><%= dt.Columns.Cast(Of DataColumn)().Select(Function(cl, idx) <th class=<%= "col" + (idx + 1).ToString() %>><%= cl.ColumnName %></th>) %></tr>
   <%= dt.AsEnumerable().Select(Function(rw, rowidx) <tr class=<%= "row" + (rowidx + 1).ToString() %>>
   <%= dt.Columns.Cast(Of DataColumn)().Select(Function(cl, idx) <td class=<%= "col" + (idx + 1).ToString() %>><%= rw(cl.ColumnName) %></td>) %>
   </tr>) %>
  </table>

 Return resultHtml
End Function
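For readers outside the .NET world, here's a rough Python sketch of the same idea - my own analogue, not part of the original post (the `to_html` helper name and its signature are invented). It builds the table element by element instead of with XML literals, using the same `row0`/`col1`-style class names as the VB version.

```python
# Hypothetical Python analogue of the VB ToHtml function above:
# turn tabular data into an XHTML table string.
from xml.etree import ElementTree as ET

def to_html(table_name, columns, rows):
    """columns: list of column names; rows: sequences matching columns."""
    table = ET.Element("table", {"name": table_name})
    # Header row gets class "row0"; columns are 1-based "colN".
    header = ET.SubElement(table, "tr", {"class": "row0"})
    for idx, col in enumerate(columns):
        th = ET.SubElement(header, "th", {"class": f"col{idx + 1}"})
        th.text = col
    # Data rows are 1-based "rowN".
    for rowidx, row in enumerate(rows):
        tr = ET.SubElement(table, "tr", {"class": f"row{rowidx + 1}"})
        for idx, cell in enumerate(row):
            td = ET.SubElement(tr, "td", {"class": f"col{idx + 1}"})
            td.text = str(cell)
    return ET.tostring(table, encoding="unicode")
```

Not nearly as slick as the XML-literal version, which is rather the point.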

Monday, February 14, 2011

Birthday kitty

Happy birthday Caméra.  You have undeniably made me into a cat person.

Sunday, February 13, 2011

My battle with iTunes continues

My battle with iTunes, the official worst music management software I've ever used, continues.  I'm moving music from my old computer to my new computer and trying to figure out how to keep my metadata.  Since I don't let iTunes manage my music any more than I have to, I just copied the .mp3 files over.  Well, of course that means I lose my album artwork, which means I have to ask iTunes to get it for me again - and of course it never gets it right.  I wound up using Dropbox to save the correct artwork from my old machine for each one iTunes got wrong, and then album by album tediously corrected it on the new machine.  Point to me, but I still wind up the one who loses :(

However, now comes the time to figure out my star ratings.  This one was a real head-scratcher until my Bing searches brought me to an answer - here's the process:

On the old computer, make 5 playlists, one for each star rating, and add your music to those lists.  Then export the playlists from the old computer and import them into the new one.  It doesn't matter where the files reside, just that iTunes knows about them.  Then select all the songs in each of those playlists and assign the star rating for all the songs in each playlist.  Genius!  The whole process took less than 10 minutes, left me far less frustrated than I expected, and was way easier than the album artwork issue.  Point to iTunes.  The stalemate continues...

Saturday, February 12, 2011

Dropbox and Junctions

If you haven't tried Dropbox, it's quite the nice little utility. It's 2GB of free online storage for whatever. Just sign up, download the app, and drop files in your Dropbox folder, and they get synced so that you can access them from anywhere. There are even mobile apps so you can get at your files on your iPhone or Android device. Nice.

The main trouble people seem to have with it is that they don't want to store their files in their Dropbox folder, so they have to copy their files from the original place and try to keep them synced by hand.  Here's a little trick I've been using, and it works great.  Windows Vista and above have a built-in feature called junctions.  Junctions, for those initiated in the Unix world, are essentially symbolic links for directories.  Basically the concept is you have a folder for a few photos here: "C:\Users\MattMc3\My Photos\Vacation Pix 2011".  You'd like those to be synced via Dropbox, which is located here: "C:\Users\MattMc3\Dropbox".  You can create a junction from your Dropbox folder into your Vacation Pix folder, and when you delete from one, you'll delete from both.  When you change one, you'll change both.  When you add to one, you'll add to both.  The folders are the same... you don't have a copy, you have a mirror.

The magic is in a little command-line tool called mklink.  On your Windows 7 or Vista computer with Dropbox already installed, do the following:

  1. Start Menu -> type "cmd" to get to a command line
  2. Change your directory to your dropbox folder by typing cd "C:\Users\MattMc3\Dropbox"
  3. Type: mklink /J "Vacation Pix" "C:\Users\MattMc3\My Photos\Vacation Pix 2011"
  4. Watch in awe as Dropbox picks up your files and syncs them.  It doesn't matter which folder you drop a file, it'll sync both ways because this isn't a copy - it's a mirror. 
Now, 2GB really isn't enough to store a lot of photos, but I have been running VisualSVN as my source code repository for a while and I wanted to get it synced to Dropbox.  This is source code all the way back to Pascal code from high school, C++ from college, and personal projects from 10+ years of .NET.  I don't really need a 2-way mirror backup; I just wanted to keep C:\SVN as my repository location and still get backups.  Thanks to junctions, I can.
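Junctions themselves are Windows-only, but the mirror behavior is easy to see with the closest Unix analogue, a directory symlink. This is an illustrative sketch of my own (the paths are invented), not part of the original post:

```python
# Demonstrate the "mirror, not copy" behavior using a directory symlink,
# the POSIX cousin of an NTFS junction (illustrative sketch).
import os
import tempfile

base = tempfile.mkdtemp()
photos = os.path.join(base, "Vacation Pix 2011")
dropbox = os.path.join(base, "Dropbox")
os.makedirs(photos)
os.makedirs(dropbox)

# Rough equivalent of: mklink /J "Vacation Pix" "...\Vacation Pix 2011"
link = os.path.join(dropbox, "Vacation Pix")
os.symlink(photos, link, target_is_directory=True)

# A file written through either path is visible through both,
# because both names point at the same directory.
with open(os.path.join(link, "beach.jpg"), "w") as f:
    f.write("pixels")
print(os.listdir(photos))  # ['beach.jpg']
```

The detail that matters for Dropbox is the same in both worlds: the sync client just sees files appear inside its folder.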

Monday, January 31, 2011

More fun with Linq - DateTime Quarters

Here's a fun little query to get the quarters for a calendar year... Put this straight into LinqPad and check it out. Of course, LinqPad still requires EOL underscores for line continuations, hence the syntactic clutter.

Dim quarters = _
   From q In Enumerable.Range(1, 4).Select(Function(x) New DateTime(DateTime.Now.Year, ((x-1) * 3) + 1, 1)) _
   Select New With { _
      .QtrStart = q, _
      .QtrEnd = q.AddMonths(3).AddMilliseconds(-1) _
   }
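The same quarter computation, sketched in Python for comparison - this is my own translation, not from the post:

```python
# Build (start, end) pairs for the four quarters of the current year.
# Quarter N starts in month (N-1)*3 + 1; its end is the start of the
# next quarter minus one millisecond, mirroring AddMilliseconds(-1).
from datetime import datetime, timedelta

year = datetime.now().year
quarters = []
for x in range(1, 5):
    start = datetime(year, (x - 1) * 3 + 1, 1)
    next_month = start.month + 3
    next_start = (datetime(year + 1, 1, 1) if next_month > 12
                  else datetime(year, next_month, 1))
    end = next_start - timedelta(milliseconds=1)
    quarters.append((start, end))
```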

Thursday, January 27, 2011

Practical Linq

So my last post on Linq was a bit short on details.  Here's a problem I was recently trying to solve.  Our company's 4-digit postal code extension changed, and our corporate Outlook signature files all needed to be updated.  There doesn't appear to be any easy way to do this with any of the tools we have.  However, we are using roaming profiles in Windows, so all of our corporate profiles are in one common network location.  The trouble is, how do we query for all the signature files, and how do we do it FAST?

There are 250,000+ files in 100,000+ folders in the roaming profiles area.  Imagine how long this would take with a simplistic System.IO.Directory.GetFiles() call that then dug through 250,000+ files trying to find what I wanted.  No way!

Linq to the rescue!

First off, Outlook signatures are located off the Application Data folder where there are a ton of other files.  On our network, all users' Windows application profile folders are here: "\\serverXYZ\RoamingData\Applications\".  From this root directory, you get into the users' directories and then each user has their own signature folder.  So, John Doe's signatures are here: "\\serverXYZ\RoamingData\Applications\John.Doe\Application Data\Microsoft\Signatures\" and Jane Doe's are here: "\\serverXYZ\RoamingData\Applications\Jane.Doe\Application Data\Microsoft\Signatures\".  I don't want to search every single directory for sigs since I should be able to get into only the directories I need.  So here's the Linq query I used to get there - by the way, this is all in VB.NET using Option Infer:

Dim baseDir = "\\serverXYZ\RoamingData\Applications"
Dim sigDirs =
    From a In Directory.GetDirectories(baseDir)
    Let sigPath = New DirectoryInfo(Path.Combine(a, "Application Data\Microsoft\Signatures"))
    Where sigPath.Exists
    Select sigPath

By habit and convention, I tend to name my selection variables in my Linq queries a, b, c, etc...  I used the magic "Let" keyword to assign a variable in the middle of my query.  This query is pretty simple - it starts off in the base directory I mentioned earlier to get the users' directories, and then it assembles their signature directory and returns an IEnumerable(Of System.IO.DirectoryInfo) of all the paths that exist.
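For illustration, here's roughly the same query expressed in Python with pathlib - a hypothetical helper of mine, not part of the post (the `signature_dirs` name is invented; the base path would be the network location described above):

```python
# For each user directory under the base, build the expected Signatures
# path and keep it only if it exists - the Let + Where + Select pattern
# from the VB query above (hypothetical sketch).
from pathlib import Path

SIG_SUBPATH = "Application Data/Microsoft/Signatures"

def signature_dirs(base_dir):
    base = Path(base_dir)
    return [d / SIG_SUBPATH
            for d in base.iterdir()
            if (d / SIG_SUBPATH).is_dir()]
```

The key property in both versions is laziness about the filesystem: only the handful of candidate paths are checked, never the full tree.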

Now, I need to get the signature files themselves.  Outlook makes 3 signature files: one in plain text, one in rich text format, and one in HTML.  I decided to hinge everything off of finding the .txt file first.  I used the little-understood and slightly dreaded SelectMany() extension.  If you are unfamiliar, here's a better explanation than I could hope to give.  So, here is the query to get the sig files:

Dim sigFiles =
    From a In sigDirs.SelectMany(Function(d) Directory.GetFiles(d.FullName, "*.txt"))
    Let txt = New FileInfo(a)
    Let rtf = New FileInfo(Path.ChangeExtension(a, "rtf"))
    Let htm = New FileInfo(Path.ChangeExtension(a, "htm"))
    Where rtf.Exists _
    AndAlso htm.Exists
    Order By txt.FullName
    Select New With {.Txt = txt, .Rtf = rtf, .Htm = htm}

This query looks in all the users' signature directories and gets all the .txt files where there's also an RTF and an HTM version of the same file.  It returns an IEnumerable of an anonymous type with .Txt, .Rtf, and .Htm System.IO.FileInfo properties.
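If SelectMany is new to you, it's a flat-map: each input element produces a whole sequence, and the sequences get flattened into one. A tiny Python illustration of just that shape (the directory names and the `fake_get_files` stand-in are invented for the example):

```python
# SelectMany as a flat-map: one directory in, many files out, all
# flattened into a single sequence (illustrative sketch).
from itertools import chain

sig_dirs = [
    "/profiles/John.Doe/Signatures",
    "/profiles/Jane.Doe/Signatures",
]

def fake_get_files(d):
    # Stand-in for Directory.GetFiles(d, "*.txt")
    return [d + "/work.txt", d + "/personal.txt"]

# Rough equivalent of sigDirs.SelectMany(Function(d) Directory.GetFiles(...)):
txt_files = list(chain.from_iterable(fake_get_files(d) for d in sig_dirs))
print(len(txt_files))  # 4
```

A plain Select would give you a sequence of lists; SelectMany (or `chain.from_iterable`) gives you one flat sequence of files, which is exactly what the rest of the query wants.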

Now, I have all the signature files.  And I know that they are stored in Windows-1252 ("Latin-1") encoding, so I'm going to iterate through them and do the replacement.  I'm making some assumptions here, like that the zip codes in the RTF and HTM files aren't bisected by style elements, but based on analysis of the files, I know that to be a reasonable tactic.
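The per-file read/replace/write step looks roughly like this in Python terms - a sketch with a hypothetical helper name, where `cp1252` is Python's name for the Windows-1252 codec the VB code requests via `GetEncoding(1252)`:

```python
# Read a file in Windows-1252, replace the old zip with the new one,
# and rewrite it only if something actually changed (hypothetical sketch).
def update_zip(path, old_zip, new_zip, encoding="cp1252"):
    with open(path, encoding=encoding) as f:
        contents = f.read()
    new_contents = contents.replace(old_zip, new_zip)
    if new_contents != contents:
        with open(path, "w", encoding=encoding) as f:
            f.write(new_contents)
        return True   # file was modified
    return False      # nothing to do
```

The "only write if changed" check matters on a network share: it keeps file timestamps (and any backup or sync tooling watching them) untouched for the files that didn't need the fix.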

So here's the final VB.NET method in its 30 lines of splendor:

Public Shared Sub DoSignatureFileUpdate(ByVal oldZip As String, ByVal newZip As String)
    Dim baseDir = "\\serverXYZ\RoamingData\Applications"
    Dim result As New List(Of String)()

    Dim sigDirs =
        From a In Directory.GetDirectories(baseDir)
        Let sigPath = New DirectoryInfo(Path.Combine(a, "Application Data\Microsoft\Signatures"))
        Where sigPath.Exists
        Select sigPath

    Dim sigFiles =
        From a In sigDirs.SelectMany(Function(d) Directory.GetFiles(d.FullName, "*.txt"))
        Let txt = New FileInfo(a)
        Let rtf = New FileInfo(Path.ChangeExtension(a, "rtf"))
        Let htm = New FileInfo(Path.ChangeExtension(a, "htm"))
        Where rtf.Exists _
        AndAlso htm.Exists
        Order By txt.FullName
        Select New With {.Txt = txt, .Rtf = rtf, .Htm = htm}

    For Each fnfo In sigFiles.SelectMany(Function(x) {x.Txt, x.Rtf, x.Htm})
        Dim latin1 = System.Text.Encoding.GetEncoding(1252)
        Dim contents = File.ReadAllText(fnfo.FullName, latin1)
        Dim newContents = contents.Replace(oldZip, newZip)
        If newContents <> contents Then
            File.WriteAllText(fnfo.FullName, newContents, latin1)
        End If
    Next
End Sub

Really powerful stuff.  This whole process took less than 2 seconds to run.

Wednesday, January 19, 2011

Linq is awesome

I have said this at work at least once a day for the past 3 years.  It makes my job so much easier.  How did I ever manage without it?  Linq is AWESOME.  That is all.