One of the nice features included in the OpenGL 3.2 Core Profile is seamless cube map filtering, which (as the name indicates) helps reduce the seams that can appear when sampling near the edges of a cube map face. It’s easy to enable:
glEnable(GL_TEXTURE_CUBE_MAP_SEAMLESS);
So I enabled it on a Mac application I’m developing, to great effect on my development platform (ATI 4870). However, I soon discovered that when running the same application on an older Nvidia 9600, the results were quite different. In fact, enabling seamless sampling made the app unusable — only a black screen would be rendered, even when using a shader program that had no cube map samplers. While trying to reduce the problem to a simpler test case, I stumbled upon a very useful snippet of code:
GLint gpuVertex, gpuFragment;
CGLGetParameter(CGLGetCurrentContext(), kCGLCPGPUVertexProcessing, &gpuVertex);
CGLGetParameter(CGLGetCurrentContext(), kCGLCPGPUFragmentProcessing, &gpuFragment);
What this does — in case it isn’t already obvious — is check whether GPU (i.e. hardware) processing is enabled for the vertex and fragment stages. I’ve sometimes wondered how to do this (OpenGL Shader Builder displays this information). So now when I enable GL_TEXTURE_CUBE_MAP_SEAMLESS, I use this check to detect a software fallback and skip seamless filtering if one is active. Strangely, it’s the vertex processing that returns 0 in this case.
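To make the pattern concrete, here is a minimal sketch of how the pieces might fit together. The `seamless_is_safe` helper and its name are mine, not Apple API; the CGL constants and calls referenced in the comments are the real ones from above.

```c
/* Returns nonzero when both pipeline stages run in hardware, i.e. the
 * values read via kCGLCPGPUVertexProcessing and
 * kCGLCPGPUFragmentProcessing are both nonzero. A value of 0 means the
 * driver has fallen back to software for that stage. */
static int seamless_is_safe(int gpuVertex, int gpuFragment)
{
    return gpuVertex != 0 && gpuFragment != 0;
}

/* On a Mac, usage would look roughly like this (requires
 * <OpenGL/OpenGL.h> and a current CGL context):
 *
 *     GLint gpuVertex, gpuFragment;
 *     CGLGetParameter(CGLGetCurrentContext(),
 *                     kCGLCPGPUVertexProcessing, &gpuVertex);
 *     CGLGetParameter(CGLGetCurrentContext(),
 *                     kCGLCPGPUFragmentProcessing, &gpuFragment);
 *     if (seamless_is_safe(gpuVertex, gpuFragment))
 *         glEnable(GL_TEXTURE_CUBE_MAP_SEAMLESS);
 */
```

In the Nvidia 9600 case described above, the vertex query returns 0, so the helper reports the fallback and seamless filtering stays off.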
Posted 26 April 2012
This is why I am not on Facebook.
A perceptive take on a recent tech meme that “scripting is the new literacy”:
I appreciate where they’re coming from. I can, from a certain perspective, agree with the argument. But, let’s not kid ourselves, literacy is the new literacy. The ability to read, comprehend, digest and come to rational conclusions — that’s what we need more of. We don’t, as a society, need more people who have the mechanical knowledge to turn RSS feeds into Twitter spam. We don’t need anything more posted to Facebook, we don’t need anything we photograph to appear on Instagram and Flickr. If “scripting” is the new literacy then we’ve failed…
Scripting isn’t the new literacy, it’s the new tinkering with the engine, the new re-wiring the house. The new DIY for the digital age.
Echoes my own immediate reaction.
Came across another choice Kernighan quote today:
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.
Unfortunately, I don’t have any context for his comment.
If you're a Mac user who works with 3D applications -- SketchUp in particular -- you've probably figured out by now that there's nothing equivalent to the Nvidia/ATI control panels on Windows for tweaking graphics driver settings, e.g. forcing the use of multisampling/antialiasing. On OS X, individual applications have to enable antialiasing themselves, and many don't, SketchUp included. Even so, SketchUp does have a hidden (and unsupported) option for enabling the feature:
defaults write com.google.sketchupfree8 "SketchUp.Preferences.AAMethod" -int 4
The last number is of course the number of samples; this tip is derived from this question on the SketchUp forums. This may not work on all Macs, and may make SketchUp less stable, but I've found it to work pretty well in practice.
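For reference, the standard `defaults` subcommands can confirm or undo the change (assuming the same pre-update bundle identifier as above):

```shell
# Read back the stored value (prints 4 if the write above took effect):
defaults read com.google.sketchupfree8 "SketchUp.Preferences.AAMethod"

# Remove the override and return to SketchUp's default behavior:
defaults delete com.google.sketchupfree8 "SketchUp.Preferences.AAMethod"
```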
Update 11/12/13: This no longer works in the current versions of SketchUp -- the bundle identifier has changed and changes via
defaults appear to be overwritten at startup. I'd suggest using the method originally described in the SketchUp forums, i.e. running the following from the Ruby console:
Posted 30 March 2011
Fascinating analysis of the effects of software interfaces — command line vs GUI — on habits of thought, and the kinds of thinking that each promotes. The discussion is couched in terms of internalization vs externalization, which I think is a helpful framework.
Internalization subjects solved the problems with fewer superfluous moves, thus with greater economy…
And particularly interesting is the difference in dealing with interruption:
…after the interruption, internalization-subjects kept improving, while externalization fell back… internalization-subjects continue to work on base of the plan-based strategy as they did before, while externalization on the other hand performs worse after interruption. They fell back depending on the interface, having a less elaborated plan.
I recently read Peter Siebel’s Coders at Work and thoroughly enjoyed it. One of the interesting themes from the interviews is a consistently negative opinion of C++. While I work with C++ on a daily basis, I find myself in agreement (despite impressive demonstrations of its expressive power). Ken Thompson’s criticisms strike to the heart of the matter:
It does a lot of things half well and it’s just a garbage heap of ideas that are mutually exclusive. Everybody I know, whether it’s personal or corporate, selects a subset and these subsets are different. So it’s not a good language to transport an algorithm—to say, “I wrote it; here, take it.” It’s way too big, way too complex. And it’s obviously built by a committee.
C++0x can make it easier to live with, but cannot redress these fundamental concerns.
Nicholas Carr, in an aside commenting on Readability, Instapaper, etc.:
At the very least, they reveal a growing awareness that the web, in its traditional form, is deeply flawed as a reading medium, and they suggest a yearning to escape what Cory Doctorow has termed our “ecosystem of interruption technologies.” What remains to be seen is how broadly and intensively these tools will actually be used. Will they really mark a change in our habits, or will they, like home exercise machines, stand as monuments to wishful thinking?
I thoroughly recommend his blog.
A well-considered take on the proliferation of programming languages:
For ethnologists, linguistic diversity is a cultural resource to be nurtured and preserved, much like biodiversity. All human languages are valuable; the more the better. That attitude of detached reverence is harder to sustain when it comes to computer languages, which are products of design or engineering rather than evolution. The creators of a new programming language are not just adding variety for its own sake; they are trying to make something demonstrably better. But the very fact that the proliferation of languages goes on and on argues that we still haven’t gotten it right.
A brief, thoughtful (but by no means new) essay on the relationships of people with their tools.
The trouble begins with a design philosophy that equates “more options” with “greater freedom.”
Via The Online Photographer.