Archive for November, 2007

Cocoa Based Smoked Pork

Friday, November 30th, 2007
23 Hour Smoked Boston Butt

As many have tweeted and ‘blogged, there was a bit of a Cocoa gathering at Apple this week.

It was mighty cool to hang with so many of the folks that consume our products.

Personally, I was ecstatic to see that so many were embracing garbage collection and finding great success therein.

It is really gratifying to see people run with the tools that we [all of Dev Tech] have pushed out. Damn, you folks are creative!

Anyway, a twitservation (Twitter conversation) — more a half-assed argument, really — with Wil Shipley, combined with the awesomeness of the kitchen, led me to cook up 33 lbs of smoked pork for lunch today.

I did it just a bit differently this time. Namely, I cooked it slightly longer — 23.5 hours, now that I have looked at the actual wall time — and slightly cooler.

23 Hour Smoked Apples in Pork Fat

I have moved to using a probe thermometer stuck through the gap between halves of the Big Green Egg to monitor temperature. As well, I’m using a large plate setter that coincidentally raises the cooking grid to 1/4″ below the opening of the BGE.

As a result, whatever temperature I set the Stoker to, that is absolutely the cooking temperature at the interface between fire and food. Consequently, this particular pork was cooked at a lower temperature than I have used in the past: the gradient between cooking grid and top of dome ran about a 30 degree downward slope (cooking grid @ 230, dome at 200). Previously, the grid probe typically sat 1.5″ above the grid and, thus, the grid temperature was probably a good 20 to 30 degrees higher than I intended.

The end result was that the fat and connective tissue was fully rendered, but the cuts of meat still had a slice to them! You could cut it with a fork easily enough, but it still required cutting.

Personally, I found it to be a more pleasing and versatile product than straight up pulled pork.

As an experiment, I halved some apples and placed the halves, open face up, in a pan under the pork as it cooked. No clue what was going to happen.

The end result was a bowl made of apple skin filled with apple stew where the water had been replaced by rendered pork fat.

Universally accepted as delicious. Next time, I’ll make quite a few more and bake them into a pie with little bits of pork fat strewn throughout.

SWT Moving to Cocoa

Friday, November 30th, 2007

A few weeks ago, the SWT folks came out to Apple to learn a bit about Cocoa & Leopard.

Very early in the week, they discovered the BridgeSupport generated XML metadata that describes almost all Objective-C and C APIs on the system (and is included with user installations of Leopard).

Using that, they whipped up a tool that automatically generates a set of Java classes that mirror the Objective-C classes, with all the necessary glue bits underneath.
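For a feel for what their generator is chewing on, here is a quick sketch (Python, since that is what I had handy) that walks Foundation’s BridgeSupport file and tallies the classes and functions it describes. The file name and element names below are recalled from a Leopard install and may differ slightly on your system.

    # Quick peek inside Foundation's BridgeSupport metadata.
    # Assumption: the .bridgesupport file name and the "class"/"function"
    # element names are recalled from Leopard and may vary.
    import xml.etree.ElementTree as ET

    PATH = ("/System/Library/Frameworks/Foundation.framework/"
            "Resources/BridgeSupport/Foundation.bridgesupport")

    root = ET.parse(PATH).getroot()
    classes = [e.get("name") for e in root.iter("class")]
    functions = [e.get("name") for e in root.iter("function")]

    print("classes described:", len(classes))
    print("functions described:", len(functions))
    print("a few of them:", sorted(classes)[:5])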

End result? SWT apps are now Cocoa apps. The goal is to host Eclipse on top of this version of SWT, and they have already had some success on that front.

Very cool.

All of this work has been pushed back into the SWT repository and they have now put out a call for help from the community.

Nature Playing With Light & Color; SF Zoo

Monday, November 26th, 2007
Peacock Tail #1
Peacock Tail #2

We went to the San Francisco Zoo for the first time yesterday. Quite the nice zoo; the animals all seemed fairly content and the grounds were beautifully designed. The paths are all winding and it is quite easy to get pleasantly lost, only to find yourself in the midst of some random part of the world’s animals.

The zoo also hosts quite the population of wild birds. Lots of ducks and seagulls, of course. We also saw a couple of great blue herons wandering and flying about. Majestic birds!

As well, there were quite a few peacocks wandering the grounds. They were relatively tame. As I was walking behind one, I noticed that its tail feathers completely change color as the light shifts.

Roger & Peacock

Roger helped me herd one into a patch of sunlight and I was able to capture the amazing color shift as the angle of light striking the tail feathers shifts by only a few degrees.

Pretty awesome creature. That really is the same bird. Didn’t catch any with their tail in full display mode.

I, of course, took a bunch of pictures.

Can Ruby, Python and Objective-C Co-exist in a Single Application?

Sunday, November 25th, 2007

In short, Yes. But not without some pain.

You can grab an example of the Python and Ruby bridges working together in a Cocoa application from either this downloadable zip or from this Subversion repository (in case something actually changes).

It works. Sort of. Ironically, this likely would have worked better under the pre-BridgeSupport versions of RubyCocoa and PyObjC.

Specifically, RubyCocoa and PyObjC both assume that nothing else might have loaded the dynamic libraries that are automatically generated by the gen_bridge_metadata script. So, either bridge will quite happily attempt to load /System/Library/Frameworks/Foundation.framework/Resources/BridgeSupport/Foundation.dylib and then barf mightily when the other bridge has already loaded the same dylib.

The example is rife with silliness related to catching the resulting exceptions and ignoring them. Worse, the fallout is such that from Foundation import * doesn’t actually cause the Objective-C classes to be defined within the importing module.

There is a back door — objc.lookUpClass() — but this is yet further evidence that, at this time, mixing these two languages in a single Cocoa application is not anything more than a silly hacque (as the SVN repository subdir indicates).
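For the curious, the back door boils down to something like this sketch; objc.lookUpClass() is the real PyObjC call, while the rest is just how the example papers over the breakage:

    # If the normal import comes up empty because the other bridge already
    # loaded Foundation's BridgeSupport dylib, ask the Objective-C runtime
    # for the classes directly.
    import objc

    try:
        from Foundation import NSMutableArray, NSMutableDictionary
    except Exception:  # the dylib clash can surface in more than one way
        NSMutableArray = objc.lookUpClass("NSMutableArray")
        NSMutableDictionary = objc.lookUpClass("NSMutableDictionary")

    # Either way, the classes behave like normal PyObjC proxies from here on.
    names = NSMutableArray.alloc().init()
    names.addObject_("hello from Python")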

What does it do?

Not much, really.

  • Start with the RubyCocoa Application Template (because I can deal with brokenness in Python, I wanted to start with something working in Ruby)
  • NSApp Delegate written in Obj-C
  • Finish loading hooks used to bootstrap PyObjC/Python (with gross exception ignoring goofiness related to the dylib)
  • Bind a table view to an array of dicts, where each dict has a “name” key leading to a string value, bound through an array controller.
  • Array controller bound through an app delegate method.
  • App delegate returns an array of dictionaries by calling a Python-based NSObject subclass.
  • Python-based NSObject subclass composes an array of dictionaries (all Python) from a combination of Python strings and Ruby strings by calling an instance of a Ruby-based subclass of NSObject (see the sketch just after this list).
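A rough sketch of that last bit, Python side only. The Ruby class name and its selector are invented for illustration; the real names live in the example project:

    # Hypothetical sketch of the Python-based NSObject subclass described above.
    # "RBStringSource" and its stringValue() method stand in for the Ruby-based
    # NSObject subclass; since both bridges register their classes with the
    # Objective-C runtime, Python can look the Ruby class up by name.
    import objc
    from Foundation import NSObject  # or objc.lookUpClass("NSObject") if the import misbehaves

    class PyNameSource(NSObject):
        def namesArray(self):
            ruby_source = objc.lookUpClass("RBStringSource").alloc().init()
            return [
                {"name": "a plain Python string"},
                {"name": ruby_source.stringValue()},  # a Ruby string, via Obj-C
            ]

PyObjC proxies the returned list of dicts so that, on the Objective-C side, they behave as an NSArray of NSDictionaries, which is what the app delegate hands to the array controller.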

Very little code. Lots of moving parts. Some gears grinding. Maybe even a gear tooth or four missing. Enjoy.

A pie-crazy thanksgiving meal!

Saturday, November 24th, 2007
Pie Crazy Thanksgiving
Thanksgiving Feast

We had an awesome thanksgiving meal over at a friend’s house. Somehow, we ended up with 7 pies for 10 people.

I made two butternut squash pies. Ben brought two walnut pumpkin pies and two closed top apple pies. Finally, Chris P. brought a sweet potato pie.

All delicious!

And that level of excess was the theme for the rest of the meal. We had green beans with fennel, salad, sweet potatoes with marshmallows on top, cranberries, and probably a couple of other dishes that I can’t remember.

And, of course, there was Turkey. Delicious oven roasted perfectly cooked juicy turkey. With stuffing.

I also brined and rosemary-smoked a 20 pound pork leg. Cooked it to an internal temperature of 145 degrees in about 4.5 hours on the BGE. The end result was just flat out stunning.

MarsEdit Full Fidelity Preview

Thursday, November 22nd, 2007
Roger Wave Watching

OK — I finally got around to actually following the instructions in this Red Sweater post. Ahh… much better.

Now MarsEdit actually shows me content pretty close to how it’ll look when published. This should greatly reduce the number of round trips when integrating pictures and text.

Got it mostly working in about 10 minutes. Still not quite full fidelity; Flickr and Google content doesn’t show up, but that is more of a feature than a bug.

Tedious, but worth it.

Wii Warning: Verify that your Miis are editable!!!

Wednesday, November 21st, 2007

My first Nintendo Wii was a lemon; it suffered from the video corruption issues related to GPU overheating and/or failure. Nintendo was extremely efficient at addressing the problem, not only replacing my Wii but also migrating all data — downloaded virtual console games and Miis — to the new Wii.

Or so I thought.

It appears that the Miis were migrated and not restored to the new system. That is, the system believes that the Miis were copied from some other system (which, technically, they were).

Why is this a problem?

Because I cannot edit Miis that I created on the original system nor can I use my Miis for new features like the Check Mii Out channel.

At this point, it appears that my only recourse is to recreate my Miis. All of them. Including losing all saved data, scores, accomplishments, etc…

Sigh.

I have a technical support ticket open with Nintendo. Hopefully, since it was their screwup, they’ll have a fix. Though, honestly, I can’t imagine sending my Wii to Nintendo just to preserve my Pro Golf rating and my 1.45kg of brain.

If you have had your Wii serviced by Nintendo and said service involved replacing the Wii, you might want to see if your Miis are really your Miis, too!

Rosemary Smoked, Garlic Infused, Leg of Lamb

Wednesday, November 21st, 2007
Rosemary Smoked, Garlic Infused, Leg of Lamb with a Side of Mint Jelly & Sesame Roasted Asparagus

While in Missouri, my mom taught me a neat trick for stuffing garlic (or other chunky spices) deep into a piece of meat. We made a roast leg of lamb on Dad’s new Big Green Egg and it was delicious.

Upon my return home, I decided to recreate the magic, so to speak.

Pictured at left is the result. Rare, garlic-infused leg of lamb. Smoked over big chunks of rosemary wood, as is evident from the beautiful red smoky color of the end piece on the far left.

I paired it with a bit of mint jelly (of course!) and some baked asparagus that had been tossed with salt, pepper, and sesame oil.

Delicious. Click on through for pics/instructions on jamming the garlic yumminess into the meat.


Acorn (and PyObjC).

Saturday, November 17th, 2007

A while ago, Flying Meat released Acorn, an image editor built with a focus upon simplicity.

I have never much been into image editing. I’m simply not smart enough to use Photoshop and don’t have the interest to invest the time necessary to overcome my mental limitations in the face of that application.

The last real image editing I did involved 2-bit black and white icon editing on NeXTSTEP.

I do, however, have a certain passion for whole image editing. That is, I like taking photos and I like tweaking ’em a bit here and there to emphasize whatever it is that I want to capture.

So, with that said, I’m not even going to try to review Acorn beyond saying that it really is simple enough that a genre-incompetent user like me was able to launch the app and get done exactly what I wanted with very little spastic monkey style interaction with the user interface.

I.e. Acorn works. Acorn is simple. And Acorn did what I wanted with minimal fuss and, in this case, fluid integration with Aperture as its external editor.

GraphicConverter — an application that I purchased long ago, without regret — has been deleted. While GraphicConverter is the more powerful application, its user experience pales in comparison to Acorn’s.

I would gladly pony up the $40 for a license to Acorn. But I didn’t have to.

As a way of saying thanks for my involvement in PyObjC over the years, the fine folks at Flying Meat sent me a complimentary license. Thank you. Seriously. Very much appreciated.

You see, Acorn has a plugin model and you can write plugins in either Objective-C or Python. Python plugins are written via PyObjC. Brilliant. Elegant. Love it.

Is speed a problem? Not really. Most of the work in the plugins will be done by Core Image filters or the like; Python is just the configuration glue on top.
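To make that concrete, here is a minimal sketch of the sort of glue involved. It uses only the stock PyObjC wrappers for Foundation and Quartz, not Acorn’s actual plugin hooks, and the filter choice is purely for illustration:

    # All of the pixel pushing happens inside a Core Image filter; Python just
    # wires up the parameters. (Acorn's own plugin API is not shown here.)
    from Foundation import NSURL
    from Quartz import CIImage, CIFilter

    def sepia_toned(path, intensity=0.8):
        """Return the output CIImage of a sepia tone filter applied to a file."""
        image = CIImage.imageWithContentsOfURL_(NSURL.fileURLWithPath_(path))
        flt = CIFilter.filterWithName_("CISepiaTone")
        flt.setDefaults()
        flt.setValue_forKey_(image, "inputImage")
        flt.setValue_forKey_(intensity, "inputIntensity")
        return flt.valueForKey_("outputImage")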

Very, very cool.

My favorite pen

Friday, November 16th, 2007
Keyboard

“My favorite pen” seems to be a meme of some popularity amongst some of the weblogs I follow.

Pictured at left is my writing tool of choice.

A Core 2 Duo MacBook Pro from about a week after said systems were released. Used hard and constantly. That key wear? Normal given my utter reliance upon LaunchBar and propensity for switching apps/contexts with intent at heart.

I hate writing via scribbly things. If it involves more than 5 minutes of continuous writing, my hand cramps up and my writing becomes beyond illegible (as opposed to every other word vaguely readable).

This has long been the case. At the age of five, I told my parents that I didn’t need to write because I would have a machine that would do it for me. And, not terribly many years later, I did.

I don’t remember ever having written more than a paragraph or two out by hand, with the sole [soul] exception being various romantically driven missives. Anything paper-like from late elementary school on was typed.

Seriously. Handwriting. Hate it. Why the hell should I waste my time scribbling down words in a medium that does not offer interactive editing capabilities and easy access? What a waste of time!