March 13, 2013

...And Now I'm Back

I started this blog as a college student, just as I was beginning the first of two summers in Germany. Those were interesting times for me, and I had plenty to write about on a regular basis. As you may have noticed, posts have slacked off a bit since then. I've posted almost nothing, as a matter of fact, since entering the world of full-time employment. You might take from that the message that my life has become significantly less interesting... and I'd be hard-pressed to argue against it. I think it is time for a change.
As much as I enjoy my work (and I'm sure I'll have an update on my progress before long), I wouldn't call it my life's work. I'd like to do something with my life besides just further it. Something that doesn't just keep me busy, or keep me fed. Something worth telling.
And so, I'm trying to become a writer.
I'm not sure how interesting this is to an outside observer, but I doubt anyone's reading this blog anyway. This is just another exercise... an excuse, potentially motivated by external pressure, to practice putting words together on a regular basis. I've been doing a better job of that lately, but part of the job is showing my work to the public--getting some feedback, maybe, and giving my writing a chance to stand on its own.
I don't have anything for you now, of course... and this blog will probably remain a stream-of-consciousness thing more than a portfolio... but if you're interested in seeing anything I end up posting, keep an eye out. It'll happen here.

December 16, 2010

Now I'm Here

I've reached a tipping point--in fact I may already be moving past it. I can barely recollect, just a bit less than two months ago, when I first started working at CreditKarma. My memory isn't strong, so perhaps I am crossing this line earlier than most--or perhaps others, when they cross this line, have a greater appreciation for what's been left behind. Regardless, I'm feeling the newness slip away, and I don't yet know what's come to take its place. It's some kind of bad timing (one that I hope will seem, in hindsight, to be perfect fate) that this is when Sabrina and I broke up. She bridged the gap between college, job-seeking, and my entrance into "real life", and I am incredibly grateful for the way she eased that transition. It feels cruel of me to reward her only by moving on... but I must stop thinking this way. It wasn't something I chose, and I certainly would have chosen differently if it had been in my power to decide.
In any case, it symbolically if not literally cuts my last tie to the past that feels so distant, and so recent. And although at the moment I feel rather lost, I can't say it left me with nothing. I have a job, one which I don't have to merely tolerate, which brings me more security and material comforts than I have ever experienced. And I have in her a friend, at the very least, who if time allows may even provide guidance in finding a more fitting, fulfilling relationship. I have other friends as well, here and there, and with time and luck I may even find out what makes that so crucial. I have everything I need, and more, and all I had to exchange for it was everything that I held close, and found comfort in. If only I could remember my past self, maybe he could convince me of how lucky I am.

December 6, 2010

I Guess I'm a Sysadmin Now

I know it's been a long time since I've written. The reason for this, I'm telling myself (literally!), is that I've been working 9-hour days with a 1.5-hour commute (each way), and as a result I have about an hour a day to myself. (Time on the train, as much as I might expect and hope that it could be productive, relaxing time, doesn't count.) Weekends now feel like an indulgence, but I can at least be grateful that I'm starting to learn *some* respect for the amount of time that I do have. Funny how we can never appreciate it when we've got too much of it. Well, no need to beat myself up about it. At least now I have much more money than I need. :D I still don't think I've processed that yet--or I've processed it far too quickly to notice.
Anyway, what's new? I have been working for more than a month now, which shocks me even more than the fact that I seem to be getting pretty good at it. I understand a lot of our system now, I'm actively participating in keeping its new features consistent and bug-free(ish), and I'm tackling duties above and beyond my job description. The most difficult of these has got to be the fact that I'm now the office sysadmin. I've never been very good at networking, and as a result I've never done it much (or maybe that's the cause). But who else was there to do it? *Nobody* here is a network guy, and I just happen to be the guy with the most time on his hands. And the fact is I do have a lot of theoretical knowledge of the domain, and I've always claimed to be a fast learner, so there are worse candidates. I'm not sure whether I could realistically say no--at the very least, I had to give it my best shot. And if I succeed, it'll at least look good on my resume.
As a result of this thinking, I've almost come to embrace the position. It's not without friction--I had, and still have, a *lot* to learn about networking. And there's this additional little inconvenience that I can't work on the network when anyone's there... meaning I have to stay late, meaning I have to lose sleep. But that only makes it clearer that I'm going above and beyond, so hopefully my efforts (and my successes! And my failures...) will be quite visible. Tonight, though, I think I may have succeeded at something. The new firewall box gave me a *lot* of grief, but I'm pretty sure I got it working, and what's more, this time I didn't walk away from it with my fingers crossed. After several weeks of bashing my head against this, I feel like I've passed a watershed (and a *successful* one, this time!). I really think I understood what I was doing. And if I can get it working the rest of the way, it will all have been worth it.

November 16, 2010

Display Bug In Google Instant

So proud of myself right now. It was discovered completely by accident, of course, but I still can't wipe this grin off my face.
I'm three weeks into my new job doing software QA, and I think I'm really taking to it. I just found a bug in Google. :D
Basically, Instant will go buggy on you if you hit the delete key (not backspace) on a fresh reload of Google.com (with Instant enabled, naturally). Playing around with the background image switcher will enhance the apparent faultiness of the results, but it's not required. Here are a few screenshots.

Normal Google with background image
After hitting delete key
After clicking "Remove background image" link

September 15, 2010

Eureka, Again...

I was walking around Eureka today, taking some photos with my new camera, and as usual I got to thinking--this time about digital cameras. I've noticed them explode in popularity in just the last six years or so, concurrent with their meteoric rise in capability, to the extent that there is now hardly anything film can do that digital can't do better. The digital method is, although I admit some prejudice here, a much more logical (if not quite as straightforward) method of recording visual input, and it crossed my mind that if digital cameras came first, the analog method, when it surfaced, would seem quite unconventional and marvelously clever. Rather than interpreting the input and distilling it into data, the analog input is instead transferred directly to another physical format, one which is static and long-lasting. It is, when it comes down to it, another method of storage, but it is a method particularly suited to its domain.
This line of thought brought me to the wholly unoriginal insight that digital processing is designed to be domain-general. It still exists in the physical world, of course, but the physical has been distilled out of it to the greatest extent possible. The majority of processing is done by clever arrangements of a single physical object--the transistor--with domain-specific hardware needed only for the interfaces to the outside world (the buttons, the displays, the light sensors, etc.). "Processing" has been purified and commoditized, and in doing so has taken the concept of "economy of scale" to a level never before seen in human history. Like most commodities, there is a healthy amount of competition in processor development, and the product has been further refined and expanded upon... but it's still the same basic product.
Processing, as a concept, need not be limited to how we think of it today. Film cameras, after all, still process and store information in their own way, and although it is a highly domain-specific method, it is also highly efficient. By thinking of processing as a theoretical task, rather than a uniform commodity, I realized that the super-fast single-stream CPU that dominates computing today is not the only option--and that its monopoly may leave other potential approaches withering in the shadows. Multi-core processors are a step in a new direction, but they are still merely an adaptation of the existing paradigm, weighted down by the heavy burden of their pedigree. The tortured adaptation of popular programming languages in an attempt to ease parallelism is evidence that a fresh start is needed.
I'm aware, or at least I suspect, that stripping "processing" of some of its uniformity would injure one of computing's greatest strengths. Nevertheless, I think it might be just what is needed to wring more concrete benefits from Moore's law. Even exponential advancement in single-stream processing will only produce minimal benefits for problems best suited to a parallel approach, and meanwhile the problems best suited to linear, step-by-step processing already have more than enough speed. Moore's law is creating a surplus of a currency we can't spend, and the exchange rates are usurious.
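To make the "parallel approach" point a little more concrete, here's a minimal sketch (not from the original post, and not anything I've benchmarked) of the embarrassingly parallel case in Python: a task that splits into independent chunks, so extra cores actually buy you something. The function names and chunk sizes are all made up for illustration.

```python
# Sketch: the same computation done single-stream vs. farmed out to
# multiple worker processes. Summing squares over a range splits into
# independent chunks, so no coordination is needed between workers.
from concurrent.futures import ProcessPoolExecutor

def sum_squares(bounds):
    """Sum n*n for n in [lo, hi) -- each chunk is independent of the others."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def sequential(limit):
    # The classic single-stream approach: one pass, one core.
    return sum_squares((0, limit))

def parallel(limit, workers=4):
    # Split the range into independent chunks and hand them to a process pool.
    step = limit // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else limit)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_squares, chunks))

if __name__ == "__main__":
    # Both paths compute the same answer; only the shape of the work differs.
    assert sequential(100_000) == parallel(100_000)
```

The interesting part isn't the arithmetic, of course--it's that the sequential and parallel versions are structured completely differently, which is exactly the kind of friction that makes retrofitting parallelism onto single-stream languages so tortured.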

I should note, for the record, that this post was not researched, and there's a chance that all of this blathering might be nothing more than me reinventing a square wheel. I know that parallelism is not a new concept, and frankly I'd be surprised if a dedicated parallel programming language hadn't been invented by now (albeit perhaps in an older, user-unfriendly language family like Lisp or Haskell). I'm also not aware of the state of processing hardware, but there certainly seems to be something of a monoculture going on--at least in the commercial sphere. But I'm no stranger to intuiting existing insights, and it's actually kind of fun. If anything might be new in all of this rambling, it's the realization that processing is treated as a commodity, and that this unspoken assumption may be the root of today's difficulties in transitioning towards parallelism. Also, this post was not initially all about parallel processing--that's just the example that sprang to mind. My aim is to conceive of processing, even digital processing, as a broader concept than just the classic CPU. That's an insight... isn't it?

August 1, 2010

Two and a half months later...

May 25, 2010

Discontempointment

(Alternative title: Discontinuity, disappointment, contentment.)

After I graduated, I expected some amount of discontinuity. I was hoping it wouldn't be a tumultuous change, but I knew that I was in for a whole lot of change, and I was even looking forward to it. But it hasn't come yet. I'm back at home--although it's not "going home", of course--and still trying to figure out how this new life will start. My old life seems to have ended--gradually, thankfully--but the ending lacks certainty. Oddly enough... I think Lost may have played a big role, too.
The show ended on Sunday night, by the way, and I ended up pretty disappointed in the writers. It was a great show, and the ending was equally great, but they don't match up--I feel like I watched the finale of some other show, one that didn't have any big mysteries or sci-fi elements, just characters interacting and going about their lives with some small amount of drama. Cheers, maybe. I suspected for some time that it would leave me disappointed, but... well, a major theme in Lost has been faith vs. reason, and you can probably guess which side I tend to fall on. Nevertheless, I kept watching in hopes that it would all make sense by the end--and if nothing else, Lost has certainly taught us the reward for faith.
But anyway, I had watched Lost pretty devotedly for the last four years or so, which was coincidentally the majority of my time at college. So the ending of Lost, if nothing else, at least provided a bit of discontinuity to mark the end of my college years. I wonder if that's enough.