### Gerschgorin Disks

Lately I’ve been reviewing my linear algebra and was reminded of an interesting result that we only briefly touched on in my first graduate linear algebra class: the Gerschgorin Disk Theorem. It’s interesting because it gives a straightforward way to bound the locations of the eigenvalues of a matrix in the complex plane in terms of its diagonal entries and row sums. Each diagonal entry serves as the center of a disk whose radius is the sum of the absolute values of the remaining entries in that row, and every eigenvalue is guaranteed to lie in at least one of these disks. Perhaps I shouldn’t be surprised that such seemingly arbitrary quantities as row sums and diagonal entries tell us quite a bit about the nature of a matrix, but I still am.
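The theorem is easy to check numerically. Here’s a quick sketch in Python with NumPy (the matrix below is just an arbitrary example I made up, not anything from the Mathematica code that follows):

```python
import numpy as np

# An arbitrary small test matrix (any square matrix will do).
A = np.array([[4.0, 1.0, 0.5],
              [0.2, -3.0, 0.1],
              [0.0, 0.3, 1.0]])

# Disk centers are the diagonal entries; each radius is the sum of the
# absolute values of the off-diagonal entries in the same row.
centers = np.diag(A)
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)

# The theorem guarantees every eigenvalue lies in at least one disk.
for ev in np.linalg.eigvals(A):
    assert any(abs(ev - c) <= r for c, r in zip(centers, radii))
```

Running this silently succeeds: each eigenvalue falls inside the disk around its nearby diagonal entry.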

Not finding a simple function to visualize these disks in Mathematica, I implemented my own:

``````GerschgorinPlot[A_] :=
(* Store the number of rows in A *)
With[{n = Dimensions[A][[1]]},
Show[
(* Plot the eigenvalues as points *)
ListPlot[{Re[#], Im[#]} & /@ Eigenvalues[A], AxesOrigin -> {0, 0},
AspectRatio -> Automatic, PlotRange -> All,
PlotStyle -> PointSize[Medium]],
(* Create a translucent gray disk for each row *)
{Graphics[{EdgeForm[Thin], GrayLevel[0.1, 0.1],
Disk[{Re[#1], Im[#1]}, #2]}]} & @@@ Thread[{
(* Disk centers *)
Table[A[[i, i]], {i, 1, n}],
(* Disk radii: absolute row sums, excluding the diagonal *)
Table[Total[Abs[A[[i]]]] - Abs[A[[i, i]]], {i, 1, n}]}]]]``````

It plots the eigenvalues as points in addition to the Gerschgorin disks. Here’s a sample of the output for a slightly perturbed 5th-order identity matrix:

I’m fairly pleased with the result, but will continue to tweak it. If you prefer MATLAB, you might have a look at this implementation.

Posted 27 January 2014

### GL_TEXTURE_CUBE_MAP_SEAMLESS on OS X

One of the nice features included in the OpenGL 3.2 Core Profile is seamless cube map filtering, which (as the name indicates) helps reduce seams that can appear from sampling near the edges of a cube map face. It’s easy to enable:

``glEnable(GL_TEXTURE_CUBE_MAP_SEAMLESS);``

So I enabled it on a Mac application I’m developing, to great effect on my development platform (ATI 4870). However, I soon discovered that when running the same application on an older Nvidia 9600, the results were quite different. In fact, enabling seamless sampling would make the app unusable: only a black screen would be rendered, even when using a shader program that had no cube map samplers. While trying to reduce the problem to a simpler test case, I stumbled upon a very useful snippet of code:

``````GLint gpuVertex, gpuFragment;
CGLGetParameter(CGLGetCurrentContext(), kCGLCPGPUVertexProcessing, &gpuVertex);
CGLGetParameter(CGLGetCurrentContext(), kCGLCPGPUFragmentProcessing, &gpuFragment);``````

What this does, in case it isn’t already obvious, is check whether GPU (i.e. hardware) processing is enabled for the vertex and fragment stages. I’ve sometimes wondered how to do this (OpenGL Shader Builder displays this information). So now, before enabling `GL_TEXTURE_CUBE_MAP_SEAMLESS`, I use this check to detect software fallback and leave the feature disabled when either value comes back 0. Strangely, it’s the vertex processing that returns 0 in this case.

Posted 26 April 2012

### Saint Zuck

This is why I am not on Facebook.

### Learn to X

A perceptive take on a recent tech meme that “scripting is the new literacy”:

I appreciate where they’re coming from. I can, from a certain perspective, agree with the argument. But, let’s not kid ourselves, literacy is the new literacy. The ability to read, comprehend, digest and come to rational conclusions — that’s what we need more of. We don’t, as a society, need more people who have the mechanical knowledge to turn RSS feeds into Twitter spam. We don’t need anything more posted to Facebook, we don’t need anything we photograph to appear on Instagram and Flickr. If “scripting” is the new literacy then we’ve failed…

Scripting isn’t the new literacy, it’s the new tinkering with the engine, the new re-wiring the house. The new DIY for the digital age.

Echoes my own immediate reaction.

### Not smart enough, by definition

Came across another choice Kernighan quote today:

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

Unfortunately, I don’t have any context for his comment.

### Antialiasing in SketchUp on OS X

If you’re a Mac user who works with 3D applications — SketchUp in particular — you’ve probably figured out by now that there’s nothing equivalent to the Nvidia/ATI control panels on Windows for tweaking graphics driver settings, e.g. forcing the use of multisampling/antialiasing. Individual applications have to enable antialiasing on OS X, and many don’t, SketchUp included. Even so, SketchUp does have a hidden (and unsupported) option for enabling the feature:

``defaults write com.google.sketchupfree8 "SketchUp.Preferences.AAMethod" -int 4``

The last number is of course the number of samples; this tip is derived from this question on the SketchUp forums. This may not work on all Macs, and may make SketchUp less stable, but I’ve found it to work pretty well in practice.

Update 11/12/13: This no longer works in the current versions of SketchUp — the bundle identifier has changed and changes via `defaults` appear to be overwritten at startup. I’d suggest using the method originally described in the SketchUp forums, i.e. running the following from the Ruby console:

``Test.set_AA_method(4)``

Posted 30 March 2011

### The Cognitive Style of UNIX

Fascinating analysis of the effects of software interfaces — command line vs GUI — on habits of thought, and the kinds of thinking that each promotes. The discussion is couched in terms of internalization vs externalization, which I think is a helpful framework.

Internalization subjects solved the problems with fewer superfluous moves, thus with greater economy…

And particularly interesting is the difference in dealing with interruption:

…after the interruption, internalization-subjects kept improving, while externalization fell back… internalization-subjects continue to work on base of the plan-based strategy as they did before, while externalization on the other hand performs worse after interruption. They fell back depending on the interface, having a less elaborated plan.

### A Garbage Heap of Ideas

I recently read Peter Seibel’s Coders at Work and thoroughly enjoyed it. One of the interesting themes from the interviews is a consistently negative opinion of C++. While I work with C++ on a daily basis, I find myself in agreement (despite impressive demonstrations of its expressive power). Ken Thompson’s criticisms strike to the heart of the matter:

It does a lot of things half well and it’s just a garbage heap of ideas that are mutually exclusive. Everybody I know, whether it’s personal or corporate, selects a subset and these subsets are different. So it’s not a good language to transport an algorithm—to say, “I wrote it; here, take it.” It’s way too big, way too complex. And it’s obviously built by a committee.

C++0x can make it easier to live with, but cannot redress these fundamental concerns.

### Monuments to Wishful Thinking

Nicholas Carr, in an aside commenting on Readability, Instapaper, etc.:

At the very least, they reveal a growing awareness that the web, in its traditional form, is deeply flawed as a reading medium, and they suggest a yearning to escape what Cory Doctorow has termed our “ecosystem of interruption technologies.” What remains to be seen is how broadly and intensively these tools will actually be used. Will they really mark a change in our habits, or will they, like home exercise machines, stand as monuments to wishful thinking?

I thoroughly recommend his blog.

### The Semicolon Wars

A well-considered take on the proliferation of programming languages:

For ethnologists, linguistic diversity is a cultural resource to be nurtured and preserved, much like biodiversity. All human languages are valuable; the more the better. That attitude of detached reverence is harder to sustain when it comes to computer languages, which are products of design or engineering rather than evolution. The creators of a new programming language are not just adding variety for its own sake; they are trying to make something demonstrably better. But the very fact that the proliferation of languages goes on and on argues that we still haven’t gotten it right.

Via LtU.