December 16, 2010

Now I'm Here

I've reached a tipping point--in fact I may already be moving past it. I can barely recollect when, just a bit less than two months ago, I first started working at CreditKarma. My memory isn't strong, so perhaps I am crossing this line earlier than most--or perhaps others, when they cross this line, have a greater appreciation for what's been left behind. Regardless, I'm feeling the newness slip away, and I don't yet know what's come to take its place. It's some kind of bad timing (that I hope will seem, in hindsight, to be perfect fate) that this was when Sabrina and I should break up. She bridged the gap between college, job-seeking, and my entrance into "real life", and I am incredibly grateful for the way she eased that transition. It feels cruel of me to reward her only by moving on... but I must stop thinking this way. It wasn't something I chose, and I certainly would have chosen differently if it had been in my power to decide.
In any case, it symbolically if not literally cuts my last tie to the past that feels so distant, and so recent. And although at the moment I feel rather lost, I can't say it left me with nothing. I have a job, one which I don't have to merely tolerate, which brings me more security and material comforts than I have ever experienced. And I have in her a friend, at the very least, who if time allows may even provide guidance in finding a more fitting, fulfilling relationship. I have other friends as well, here and there, and with time and luck I may even find out what makes that so crucial. I have everything I need, and more, and all I had to exchange for it was everything that I held close, and found comfort in. If only I could remember my past self, maybe he could convince me of how lucky I am.

December 6, 2010

I Guess I'm a Sysadmin Now

I know it's been a long time since I've written. The reason for this, I'm telling myself (literally!), is that I've been working 9-hour days with a 1.5-hour commute (each way), and as a result I have about an hour a day to myself. (Time on the train doesn't count, as much as I might expect and hope that it could be productive, relaxing time.) Weekends now feel like an indulgence, but I can at least be grateful that I'm starting to learn *some* respect for the amount of time that I do have. Funny how we can never appreciate it when we've got too much of it. Well, no need to beat myself up about it. At least now I have much more money than I need. :D I still don't think I've processed that yet--or I've processed it far too quickly to notice.
Anyway, what's new? I have been working for more than a month now, which shocks me even more than the fact that I seem to be getting pretty good at it. I understand a lot of our system now, I'm actively participating in keeping the new features of our system consistent and bug-free(ish), and I'm tackling duties above and beyond my job description. The most difficult of these has got to be the fact that I'm now the office sysadmin. I've never been very good at networking, and as a result I've never done it much (or maybe that's the cause). But who else was there to do it? *Nobody* here is a network guy, and I just happen to be the guy with the most time on his hands. And the fact is I do have a lot of theoretical knowledge of the domain, and I've always claimed to be a fast learner, so there are worse candidates. I'm not sure whether I could realistically say no--at the very least, I had to give it my best shot. And if I succeed, it'll at least look good on my resume.
As a result of this thinking, I've almost come to embrace the position. It's not without friction--I had, and still have, a *lot* to learn about networking. And there's this additional little inconvenience that I can't work on the network when anyone's there... meaning I have to stay late, meaning I have to lose sleep. But that only makes it clearer that I'm going above and beyond, so hopefully my efforts (and my successes! And my failures...) will be quite visible. Tonight, though, I think I may have succeeded at something. The new firewall box gave me a *lot* of grief, but I'm pretty sure I got it working, and what's more, this time I didn't walk away from it with my fingers crossed. After several weeks of bashing my head against this, I feel like I've passed a watershed (and a *successful* one, this time!). I really think I understood what I was doing. And if I can get it working the rest of the way, it will all have been worth it.

November 16, 2010

Display Bug In Google Instant

So proud of myself right now. It was discovered completely by accident, of course, but I still can't wipe this grin off my face.
I'm three weeks into my new job doing software QA, and I think I'm really taking to it. I just found a bug in Google. :D
Basically, Instant will go buggy on you if you hit the delete key (not backspace) after a fresh reload of Google.com (with Instant enabled, naturally). Playing around with the background image switcher will enhance the apparent faultiness of the results, but it's not required. Here are a few screenshots.

Normal Google with background image
After hitting delete key
After clicking "Remove background image" link

September 15, 2010

Eureka, Again...

I was walking around Eureka today, taking some photos with my new camera, and as usual I got to thinking--this time about digital cameras. I've noticed them explode in popularity in just the last six years or so, concurrent with their meteoric rise in capability, to the extent that there is now hardly anything film can do that digital can't do better. The digital method is, although I admit some prejudice here, a much more logical (if not quite as straightforward) method of recording visual input, and it crossed my mind that if digital cameras had come first, the analog method, when it surfaced, would have seemed quite unconventional and marvelously clever. Rather than interpreting the input and distilling it into data, the analog method transfers it directly to another physical format, one which is static and long-lasting. It is, when it comes down to it, another method of storage, but it is a method particularly suited to its domain.
This line of thought brought me to the wholly unoriginal insight that digital processing is designed to be domain-general. It still exists in the physical world, of course, but the physical has been distilled out of it to the greatest extent possible. The majority of processing is done by clever arrangements of a single physical object--the transistor--with domain-specific hardware needed only for the interfaces to the outside world (the buttons, the displays, the light sensors, etc.). "Processing" has been purified and commoditized, and in the process has taken the concept of "economy of scale" to a level never before seen in human history. As with most commodities, there is a healthy amount of competition in processor development, and the product has been further refined and expanded upon... but it's still the same basic product.
Processing, as a concept, need not be limited to how we think of it today. Film cameras, after all, still process and store information in their own way, and although it is a highly domain-specific method, it is also highly efficient. By thinking of processing as a theoretical task, rather than a uniform commodity, I realized that the super-fast single-stream CPU that dominates computing today is not the only option--and that its monopoly may leave other potential approaches withering in the shadows. Multi-core processors are a step in a new direction, but they are still merely an adaptation of the existing paradigm, weighted down by the heavy burden of their pedigree. The tortured adaptation of popular programming languages in an attempt to ease parallelism is evidence that a fresh start is needed.
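To make that complaint a bit more concrete, here's a toy sketch of the kind of bolted-on parallelism I'm grumbling about (just plain Python with its standard multiprocessing module--an illustration I made up, not a recommendation):

from multiprocessing import Pool

def square(n):
    # Stand-in for some real per-item work.
    return n * n

if __name__ == '__main__':
    numbers = list(range(1, 1001))

    # The linear, classic-CPU version: one step after another.
    serial = [square(n) for n in numbers]

    # The parallel version: the same logic, but wrapped in extra
    # machinery (a pool of worker processes, arguments shipped back
    # and forth) that sits outside the normal flow of the language.
    with Pool() as pool:
        parallel = pool.map(square, numbers)

    assert serial == parallel

Both versions compute the same thing; the point is how much ceremony the second one needs before those extra cores do us any good.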
I'm aware, or at least I suspect, that stripping "processing" of some of its uniformity will injure one of computing's greatest strengths. Nevertheless I think it might be just what is needed to wring more concrete benefits from Moore's law. Even exponential advancement in single-stream processing will only produce minimal benefits for problems best suited to a parallel approach, and meanwhile the problems best suited to linear, step-by-step processing already have more than enough speed. Moore's law is creating a surplus of a currency we can't spend, and the exchange rates are usurious.

I should note, for the record, that this post was not researched, and there's a chance that all of this blathering might be nothing more than me reinventing a square wheel. I know that parallelism is not a new concept, and frankly I'd be surprised if a dedicated parallel programming language hadn't been invented by now (albeit perhaps an ancient, user-unfriendly language like Lisp or Haskell). I'm also not aware of the state of processing hardware, but there certainly seems to be something of a monoculture going on--at least in the commercial sphere. But I'm no stranger to intuiting existing insights, and it's actually kind of fun. If anything might be new in all of this rambling, it's the realization that processing is treated as a commodity, and that this unspoken assumption may be the root of today's difficulties in transitioning towards parallelism. Also, this post was not initially all about parallel processing--that's just the example that sprang to mind. My aim is to conceive of processing, even digital processing, as a broader concept than just the classic CPU. That's an insight... isn't it?

August 1, 2010

Two and a half months later...

May 25, 2010

Discontempointment

(Alternative title: Discontinuity, disappointment, contentment.)

After I graduated, I expected some amount of discontinuity. I was hoping it wouldn't be a tumultuous change, but I knew that I was in for a whole lot of change, and I was even looking forward to it. But it hasn't come yet. I'm back at home--although it's not "going home", of course--and still trying to figure out how this new life will start. My old life seems to have ended, and thankfully it was gradual, but at the same time it lacks certainty. Oddly enough... I think Lost may have played a big role, too.
The show ended on Sunday night, by the way, and I ended up pretty disappointed in the writers. It was a great show, and the ending was equally great, but they don't match up--I feel like I watched the finale of some other show, one that didn't have any big mysteries or sci-fi elements, but just characters interacting and going about their lives with some small amount of drama. Cheers, maybe. I suspected for some time that it would leave me disappointed, but... well, a major theme in Lost has been faith vs. reason, and you can probably guess which side I tend to fall on. Nevertheless, I kept watching in hopes that it would all make sense by the end--and if nothing else, Lost has certainly taught us the reward for faith.
But anyway, I had watched Lost pretty devotedly for the last four years or so, which was coincidentally the majority of my time at college. So the ending of Lost, if nothing else, at least provided a bit of discontinuity to mark the end of my college years. I wonder if that's enough.

April 28, 2010

Nothing New Under the Sun

You can consider this a wrap-up post for my academic career. I don't mean that I intend to stop learning, or even stop attending classes, but I've been going to school since I can remember, and in three weeks I won't be going to school anymore. I'm finally graduating from college, and at this point in my life I'd like to take a break--by which I mean I intend to enter the working world, rather than continue on to graduate school. It's only fitting that I should be reminded, now, of something I've noticed more and more frequently as I've accumulated knowledge.
Don't take the title too literally--I am certain that there are things yet to be discovered. But as history piles up behind us, there seems to be less and less to discover. Humanity certainly knows more now than any single human could figure out on his own, even over an entire lifetime. As a result, a person's level of knowledge is more a measure of how exhaustively he has gathered up the lessons that others have learned. This isn't directly linked to years of formal education, of course... but it's not far off, either.
I speak from experience. Many times in the past, I have discovered something clever or insightful (often about philosophy, psychology, or the human condition), only to find out, sometimes years later, that it had already been discovered centuries ago, and written down for everyone to know. I'm not opposed to sharing knowledge... but the results of it can be disenchanting, sometimes.
I haven't mentioned it before because I could never recall a specific example. But in this case, it happened moments ago. I'm pretty sure I just figured out the Sieve of Eratosthenes.

There I was, moments ago, sitting around thinking about primes. (Not my usual occupation, I can assure you.) I started thinking about the quest for a pattern behind primes--a way to predict them. It occurred to me that there was a pattern, but it wasn't a pattern of primes. It was a pattern of factors. Imagine a number line of integers, with a line (like a sine wave) going along it, passing through 2, 4, 6, 8, etc. The integers it passes through are even numbers, and the gaps are not. The next number that occurs in a gap has another line starting at it, and recurring at 6, 9, 12, etc. There are now some overlaps (which may have significance, but not at the moment), but fewer gaps. Now repeat. It becomes a cornucopia of interwoven patterns, and the patterns are all very simple... but their interaction is not. No matter how many lines there are, there are always gaps. And as thick as the nest becomes, there's always another line arising from the prime in the gap, to add its pattern to the rest.
Which is a much better visual than the GIF in that Wikipedia page (just imagine all the primes glowing... and maybe the lines start out thick and taper as they go... isn't that pretty?). But I'm still kind of disappointed that someone else thought of it first. Rational or not... I feel as if I've arrived at a party thousands of years late, and all the beer's gone.
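Since I went to the trouble of reinventing it, I might as well write it down. Here's the sieve as a few lines of Python--my own quick sketch of the idea, not anything lifted from that Wikipedia page:

def sieve(limit):
    # Track which integers up to limit are still sitting in a "gap".
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            # n survived every earlier line, so it's prime; start its own
            # line and cross off all of its multiples.
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n in range(2, limit + 1) if is_prime[n]]

print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]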

February 4, 2010

The Joys of Linux

I'm taking a course on Python this semester. I'm really excited about it; there really is no other word for it. I'm sure that my excitement over something as simple as, say, simultaneous assignment won't be easily conveyed to non-programmers or even anyone besides myself. Nevertheless, the new toys in Python, and particularly the new paradigms (I have never dabbled in functional programming before now, but the possibilities offered by yield and generators are making me positively giddy), are very exciting.
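For the curious (and for my future self), here's a toy illustration of the two things I just gushed about--nothing from the course, just a few lines I made up:

# Simultaneous assignment: swap two values without a temporary variable.
a, b = 1, 2
a, b = b, a

# A generator: each value is produced lazily, only when somebody asks for it.
def fibs():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

gen = fibs()
print([next(gen) for _ in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]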
The course is just now migrating to Python 3, so in a sense it's good that I didn't get started with the language until now. Unfortunately the default Python documentation (http://docs.python.org/) is for 2.6.4, so it's very easy to get tripped up by outdated information. In a yet more sinister turn, Ubuntu itself uses 2.6 by default! I didn't think this was much of a problem at first. A bit of Googling eventually schooled me in the proper use of update-alternatives, and soon I had set the default python command to call python3. But then the problems came.
The first sign I had that something was wrong was a big red "do not enter"-style notification at the top right, telling me that the update manager wasn't working. I won't go into the details, but I tried all I could to fix it, and somehow I fooled around with dpkg to the extent that GRUB now displayed my install as "Debian". I had all but given up, and I was on the verge of reinstalling, when I booted up today and realized that Dropbox wasn't running, either. Running dropbox in a terminal gave me some strange error, which seemed to hint that the file being run was a text script, not a binary executable. So, on a hunch, I opened up /usr/bin/dropbox in a text editor, and what do I see?
#!/usr/bin/python
The dropbox script was written for Python 2.x (it was Unicode strings causing this particular error), and my messing around with the default python command was what caused my problems. Evidently the update manager also relied on Python, because resetting the default command back to 2.6 allowed the update manager to run again, and at the moment it looks like everything's going to be okay.
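I didn't keep the exact traceback, and I can't say for certain which construct tripped it up, but here's my guess at the flavor of the breakage (a made-up two-liner, not the actual dropbox source):

# Fine when /usr/bin/python points at Python 2.x:
greeting = u'hello'  # an explicit Unicode string literal

# Under Python 3, that u'' prefix is no longer needed (every string is
# Unicode already), and in 3.1 it's rejected outright as a syntax error--
# so a script full of these dies before a single line of it ever runs.
print(greeting)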

UPDATE: Everything was not okay.

January 17, 2010

Searching for a Clearer View

I'm running on four hours of sleep, and not a whole lot from the previous few nights either, so at the moment (and probably for the rest of the day) I am wracked with the symptoms of sleep deprivation. Nevertheless, I have just caught a glimpse of what it might mean to be a "morning person".
Two things to note here: The first is my penprevious post, in which I used the term "incredible lucid solitude". The second is that I have resolved (only incidentally in January) to wake up at 6 am every day for the entire semester, on the grounds that I will adjust, and it will be easier than waking up at 6 am (thoroughly unadjusted) two days a week. Today was the first day in weeks that I've woken up so early, and I am quite emphatically Not Adjusted yet.
Anyhow, my point is that I have just experienced something akin to the lucidity that, for me, usually comes from staying up incredibly late, not from waking up incredibly early. The drowsiness has dulled it, of course, but I get the feeling that once I have adjusted, I'll have five days a week containing several hours of clearheadedness, insight, and inner peace. Assuming, of course, that I am capable of adjusting. I've never been a morning person, but I'm starting to see that it might not be so different from being a night owl, after all. Just colder.

January 12, 2010

Clam's Search for Meaning

Sometimes I feel as if I lead a charmed life. I feel privileged, knowing that I can achieve anything I put my mind to, and I remind myself not to take that privilege for granted.
But sometimes I get the feeling that I've been spending my life diligently avoiding any desires that can't be fulfilled. I can always have anything I want, as long as I'm careful about not wanting what I can't have. Far from being uplifting, this is really very sad. What it means is that I can't fail, so not only do I stop before attempting risky-but-surmountable heights, but I also avoid the learning experiences of failure. It's kind of a fox and the grapes thing, although I don't think I'm bitter about it.
This is an exaggeration, of course. I have taken risks before, and I have not led a life entirely free of failure. The degree of it, though, is really quite humbling. I've been trying to write my resume, and I've got it pretty well sorted out now, but I am choking on the "personal statement" part. I know that my first job out of college doesn't determine my entire career, but I'm expected to have some idea of what I want to do in life... a passion. And right now, I'm feeling pretty dispassionate.
It seems to me that the reason for this lack of passion is fear. What I'm going to do in a job is very different from what I've been doing in school, so by necessity my passion will be something I've never done before--I need to be moving forward, essentially. What that means is that I need to want something that hasn't been shown (through experience) to be achievable... so I feel like I'm taking a leap of faith. Yes, I know I've been learning this stuff in school, and I've even been pretty successful doing actual work at my internships. But it's still just far enough out there to give me fear, and we all know what the Bene Gesserit say about fear.
So what are my options? Well, if I'm opposed to stagnation (which, for the record, I am), it seems like there's no option but to face the fear and overcome it. Kind of platitudinous, I admit, but it's much easier said than done. How to face it is the real question. I'll get back to you when I figure out an answer.

January 1, 2010

Waking Up Is Hard to Do

Whenever I find myself scheduling an activity before 6 am (or, more likely, being scheduled), I generally choose to go without sleep rather than wake up that early. Funny that I've never noticed the pattern before, or at least never noticed how tenacious it is. It's not surprising, given that falling asleep can be a difficult thing to do on demand, while waking up on demand is merely painful, not hard to accomplish. In some cases it is the only sane option (as tonight, when the potential sleeping time was between drinking champagne at midnight and heading out at 4 am), and in some cases it can be insanely inadvisable (as on my weekend trip to London, when I tried to save a night's hostel money by simply strolling around until my 6 am flight out). There ought to be something profound about the fact that I so despise waking up... or, more importantly, about the incredible lucid solitude only found at 5 am, when all the world's asleep.
At the moment, though, I can't think of what it is, and I'm just writing this down so I won't forget. I'm awfully tired, you see...