Archive for January, 2010

DE Razor Reviews Summary

Saturday, January 30th, 2010

Many months ago, I went from a half inch or greater beard to clean shaven. After a few weeks of dealing with cartridge blades, my lovely wife Christine gave me a very nice double-edged safety razor set. A bit of research revealed that the choice of blade was critical to the experience.

[Image: Blades Scaled.png]

Being who I am, the only logical conclusion was to buy a sampler pack covering the most popular blades and try a new blade each week to find one that I liked (and to actually determine whether or not the brand of blade even matters)!

Now that I’m done, I can say I encountered a few surprises and amassed a handful of questions along the way. So, in summary:

Why bother?

If you are going to keep a clean-shaven face, a double-edged safety razor strikes a nice balance of price and performance.

First and foremost, the quality of shave produced by a double-edged safety razor is way beyond that provided by a cartridge razor. Notably, my face stays clean shaven far, far longer. With a cartridge razor, an 8 a.m. shave would leave a rough face by 4 p.m. With a decent double-edged blade, I still have a smooth face at 11 p.m. Beyond that, my skin is a hell of a lot less irritated.

Conveniently, it is also comparatively cheap. Cartridge blades are stupidly expensive (dollars apiece), whereas a decent quality double-edged safety blade is all of about 6 cents per blade in bulk.

Achieving a decent shave with a DE safety razor requires paying a bit of attention and taking things a bit slow. After a few weeks, it becomes a bit of a morning ritual… a bit of calm amongst the storm that is my life.

Which razor am I using and what cream?

The one on the left. It is a relatively plain Merkur straight safety razor: medium weight and not that aggressive (i.e., the gap between the guard and the blade isn’t that wide). It feels good in the hand and is easy to clean.

Before the shave, I rub a bit of oil into my face. As for cream, I’m using the house brand from The Art of Shaving, applied with a badger-hair brush. It works well enough, but I’ll probably try some others as this runs out.




Solar Install Part 1: The Madness of Eichler Roofs

Monday, January 18th, 2010
One Row of Solar Panels

As a part of our ongoing home improvement adventure, we are installing solar panels. Between the state and federal rebates, the increasing cost of electricity, and the improvements in solar technology, it is an investment that will pay for itself in a decade or two. Maybe less, if California really starts paying for excess production.

And, of course, solar scratches my techno-geek itch. In particular, the system we are installing uses per-panel micro-inverters that leverage IP-over-powerline networking to synchronize phase with each other and deliver power back to the grid. As well, it makes the system easily expandable, in that we can drop in new panels without having to replace a costly single inverter.

Single Micro-Inverter

Apparently, when all is said and done, I’ll have access to a web site with a set of schematics showing our panel layout along with individual and overall power-generation statistics.

Of course, being that we live in an Eichler, the path between concept and final installation has to have at least one adventure.





DE Razor Review: Sharp Stainless

Sunday, January 10th, 2010
[Image: Sharp Scaled.jpg]

I shouldn’t be surprised that the last blade from the sampler that was new to me would yield some unexpected results. (I still have the Merkur to review, but that was the blade my razor came with and, thus, I’m saving it for last. That, and I’m likely going to be reviewing a different kind of Israeli Personna and updating that particular review, too.)

The Sharp (stainless) comes in just about the most unassuming packaging of any blade: a simple cardboard box. Yet that simple cardboard box has a bit of hologram embedded in it! Most likely, this “seal of authenticity” is an attempt to stymie counterfeiters, who are, apparently, quite a problem for some manufacturers!

The blades themselves are wrapped in not one, but two pieces of wax paper: one with the logo and one transparent. Held together without glue, even!

So, that unassuming package actually held some of the most competently wrapped blades of any I have tried.

The blade itself proved to be quite sharp. Not Feather sharp, but still quite a bit sharper than most other blades I tried.

The resulting shave is decent, but far from superb. It provides a perfectly competent shave without too many cuts or too much burning of the skin. Yet, still, there was some irritation, and it drew a bit of blood where a week-old Dorco had not.

And after nearly a week of use, the blade has held its edge competently, too.

The only real ding against the blade is that the little box doesn’t provide a means of disposing of used blades, as some other brands do.

That word… competent… has come up often in describing this blade. Apt, too, as it really is a competent blade. Given the impression offered by the packaging, my surprise was discovering a perfectly serviceable blade inside. The two other blades in similar packaging (the Personnas and the 7a.m.) were awful!

If Sharp Stainless were all I could find, I would have no complaints!

Using malloc to Debug Memory Misuse in Cocoa

Sunday, January 10th, 2010

Every few months, there is a discussion on cocoa-dev or a question on stackoverflow.com that basically boils down to “I have a leak or over-release and I can’t use Instruments to debug it. Help?”.

Quite often, the questioner can actually use Instruments just fine, but simply lacks the know-how or hasn’t tried in a while and doesn’t realize that Instruments has improved significantly with each release of the developer tools. No, really, Instruments is a fantastic tool and I use it whenever I can; what you see below is for the exceptional case, not the norm.

There are cases where using Instruments is either inconvenient or impractical. Namely, trying to track down an intermittent crasher, or trying to gain insight into memory leaks over a long-running session, will create a prohibitively large dataset for Instruments to process (Instruments allows for much more detailed analysis of the object graph, and this analysis loads a lot more data than the tools I’ll demonstrate below).

Thus, it is helpful to be familiar with the rather powerful set of tools available from the command line and within the debugger.

Almost always, you are going to want to enable a bit of additional data collection via the malloc infrastructure. Have a look at the malloc(3) man page; there is an entire section devoted to ENVIRONMENT variables, and a handful of them are extremely useful!

First and foremost, you are almost always going to want to use MallocStackLoggingNoCompact. When enabled, malloc stack logging writes the stack trace of every allocation and free event to an efficiently compact binary file in /tmp/ (it used to be kept in memory and, thus, used to be a great way to exhaust the heap. No longer!!). Unfortunately, it doesn’t record retain and release events, but simply knowing where an object was allocated is generally quite useful (it is usually relatively easy to track down who retained an object once you know which object it is). Under GC, you can set AUTO_REFERENCE_COUNT_LOGGING, and CFRetain/CFRelease events will be logged to the malloc history.
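As a sketch of how this gets enabled from Terminal (the application name and path below are stand-ins for your own app; you can also set the variable in your executable’s environment settings within Xcode):

```sh
# Run one instance of the app with malloc stack logging enabled.
# "MyApp" is a placeholder for your application's name.
MallocStackLoggingNoCompact=1 ./MyApp.app/Contents/MacOS/MyApp
```

Setting the variable on the command line like this scopes the logging to that single launch of the process.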

You can then use the malloc_history command line tool to query for all events related to a particular address in memory.

While malloc_history requires that the process still exist, the process doesn’t have to be running! If you run your app under gdb, you can still use malloc_history to query the application even while it is stopped in the debugger!
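A typical session looks something like the following; the process name, pid, and address are all hypothetical (the address would normally come from a leaks report, a zombie warning, or the debugger):

```sh
# Find the target process's pid, then dump every recorded
# allocation/free event for a given address.
ps ax | grep MyApp
malloc_history 12345 0x100110ab0
```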

Speaking of gdb, you can use the info malloc command in gdb to query the same information. Under GC, the info gc-roots and info gc-referers commands can be used to interrogate the collector for information about the connectivity of the object graph in your running application.
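From a gdb prompt, those commands look something like this (the address is, again, a hypothetical stand-in):

```
(gdb) info malloc 0x100110ab0
(gdb) info gc-roots 0x100110ab0
(gdb) info gc-referers 0x100110ab0
```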

If you enable zombies via setting the NSZombieEnabled environment variable to YES, the address spewed in the error message when messaging a zombie can be passed directly to malloc_history.
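Putting the two together (the app name, pid, and address below are all placeholders):

```sh
# Zombies report which address was messaged after deallocation;
# malloc_history then reveals where that object was allocated.
NSZombieEnabled=YES MallocStackLoggingNoCompact=1 ./MyApp.app/Contents/MacOS/MyApp
# ... console: "message sent to deallocated instance 0x100110ab0" ...
malloc_history 12345 0x100110ab0
```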

The leaks command line tool scans memory and detects leaks in the form of allocations for which the address of that memory is not stored anywhere else within the application. The leaks tool was vastly improved in the Snow Leopard release of the Xcode tools; it is much, much faster and almost never spits out false positives. It is still possible to have a leak that leaks cannot detect, of course. And, remember, even if memory is still reachable, it is still a total waste if you never use its contents again!
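Usage is equally simple (the process name and pid are placeholders); note that when malloc stack logging is enabled for the target, leaks can also report the allocation backtrace of each leaked block:

```sh
# Scan a running process for unreferenced malloc'd blocks.
leaks MyApp     # by process name
leaks 12345     # or by pid
```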

So, that is a brief summary of the state of command line memory debugging on Mac OS X as of Snow Leopard. Of course, that’s just a bunch of words. How about an example?
