Breakfast was good, and my ride in went well. There'd been some brief overnight rain, and the temperature was down to something more comfortable.
The first talk was a bunch of statistics about kernel development--rate of change (high!), number of developers (lots, and increasing!). The data was kind of fun to look at, but didn't seem very well thought-out to me; for one thing, it wasn't entirely clear what they were trying to figure out, or how the numbers might help. And for another, I felt like it fell into the usual quantity-over-quality trap: if they'd spent more time carefully examining a few randomly chosen examples, instead of trying to collect huge gobs of (probably inaccurate) aggregate statistics, they might have had more interesting results.
There was a giant chart on the back wall of the room with a bubble for each kernel developer and lines connecting any two kernel developers who had consecutive sign-offs on some patch. They were collecting autographs next to each developer's name. I found mine, but couldn't follow any of the lines--the chart wasn't really printed with sufficient resolution.
After that there was an interesting talk by someone who'd examined the performance of some big "enterprise" (I think that means Oracle) workloads on a machine with a lot of processors--four quad-core chips. The problem here is that access to the computer's RAM is really slow, so each processor has to have a cache of recently used memory. But then those caches have to be kept consistent--if a write to memory on one processor is followed by a read of the same address on another processor, then you have to make sure the read sees the new value stored at that address. Keeping the caches consistent is slow, so if it happens a lot, performance can decrease dramatically. So you try to make sure that different processors are mostly working on different parts of memory, instead of bouncing around the same piece of memory from one cache to another. The speaker seemed to have some quite nice tools that would show exactly which pieces of which data structures were getting bounced around too much, and which code was responsible.
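The talk itself wasn't about code you could take home, but the bouncing problem is easy enough to sketch. The little C program below is my own toy illustration (nothing from the talk; the 64 bytes of padding just assumes a 64-byte cache line): two threads each increment their own counter, and the only difference between the two runs is whether the counters happen to sit on the same cache line.

#include <pthread.h>
#include <stdio.h>

#define ITERATIONS 100000000L

/* Two counters packed together: almost certainly the same cache line. */
static struct { volatile long a, b; } packed_counters;

/* The same two counters, padded apart (assuming 64-byte cache lines). */
static struct { volatile long a; char pad[64]; volatile long b; } padded_counters;

/* Each thread hammers away at its own counter. */
static void *bump(void *arg)
{
    volatile long *counter = arg;
    for (long i = 0; i < ITERATIONS; i++)
        (*counter)++;
    return NULL;
}

static void run_pair(volatile long *x, volatile long *y)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, bump, (void *)x);
    pthread_create(&t2, NULL, bump, (void *)y);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
}

int main(void)
{
    /* Counters share a cache line: every write on one processor
     * invalidates the line in the other processor's cache. */
    run_pair(&packed_counters.a, &packed_counters.b);

    /* Counters on separate lines: no bouncing. Time the two phases
     * separately (or watch them under a profiler) to see the gap. */
    run_pair(&padded_counters.a, &padded_counters.b);

    printf("%ld %ld\n", packed_counters.a + packed_counters.b,
           padded_counters.a + padded_counters.b);
    return 0;
}

Neither thread ever touches the other's counter, so there's no "real" sharing at all--the slowdown in the first phase comes purely from the cache line ping-ponging between processors, which is exactly the kind of thing the speaker's tools were built to spot.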
I didn't really pay as close attention to anything else that day.
Dinner was at a nearby pseudo-Irish pub, with a few selinux ("security enhanced linux") and NFS people. There's some interesting work to be done to make the two systems play well together. By the end of dinner I felt like I understood at least a few basic things about selinux.
I couldn't get my laptop to work with the conference wireless network, but found some ethernet ports and managed to get some routine work done in gaps during the day.
I was idly hoping to make it to one of the Ottawa Jazz Festival events in the evening, but by the end I was tired and ready to go back.
And the ride home was fine again. I'm feeling a little more comfortable on the rental bike.