My thoughts on the future of LMP1-H, after the 2017 24 Hours of Le Mans

It’s been a day since the 24 Hours of Le Mans, and I’ve finally put my thoughts together on it. It was something else, unlike anything I’ve ever seen before.

Before the 24 Hours of Le Mans, predictions were that Toyota would walk away with the race on both pace and reliability. Predictions were that the all-new and huge LMP2 field would have failures en masse, having shown mediocre reliability in the lead-up to Le Mans. And, fears were that the GTE-Pro field would have poor balance of performance, allowing someone to run away with it.

Flashing linear flash cards – quick notes

I recently bought an HP OmniBook 430 that needed a system ROM card, which has to be on a linear flash card.

Creating such a card, however, was a bit of an ordeal, so I’ll quickly document what exactly I did.

Ultimately, I used a ThinkPad 365XD, but the card I used would have been usable in almost any laptop with PCMCIA. (I tried a Fujitsu Lifebook P1620 as well, but had trouble finding a sufficiently old Linux that worked with its hardware… although now that I know what I was doing wrong, I think I could have used something more modern.) The card was a Viking Technology 24 MiB card, which was actually an Intel Value Series 200 card. It’s a 3.3/5 volt read, 3.3/5 volt program card using Intel StrataFlash, or what we’d now refer to as MLC flash. Complicating matters, this card had no attribute memory and was completely blank, meaning that many tools couldn’t automatically figure out what it was. (I’m still not sure whether it’ll actually work in the target device, partially because it’s a Value Series 200 card – if not, I’ve ordered a Series 2+ card, which is likely a better choice anyway – but my goal here was to successfully program it, and that’s definitely been done.) For software, I used Damn Small Linux 4.4.10, plus the DSL MTD kernel modules from here.

In a root terminal, I issued the following commands, after saving mtd-modules.tgz in /home/dsl/.

cd /
# extract the DSL MTD kernel modules over the live filesystem
tar -xzf /home/dsl/mtd-modules.tgz
# load the PCMCIA-to-MTD driver: 16-bit bus, 24 MiB card, 5.0 V programming voltage
modprobe pcmciamtd buswidth=2 force_size=24 vpp=50 setvpp=1
# provide the /dev/mtdN character device interface
modprobe mtdchar
# replace the stripped-down DSL card manager config with the full Knoppix one
rm /etc/pcmcia/config
cp /KNOPPIX/etc/pcmcia/config /etc/pcmcia/config
nano /etc/pcmcia/config

(OK, in reality, I issued a bunch of other commands, but those are the ones that were actually important, I think.)

At this point, I added two lines near the beginning of the file:

device "pcmciamtd"
  class "memory" module "pcmciamtd"

Then, I changed the binding of Anonymous Memory from memory_cs to pcmciamtd, later in the file.

After saving that, killing cardmgr, and relaunching it, I found that /proc/mtd listed the device, and dmesg showed that it was configured (and that CFI had successfully identified the flash chip it was working with)… but there was no device node. Apparently MTD didn’t create nodes automatically back then, so a quick mknod /dev/mtd0 c 90 0 took care of it. At that point, I could simply cat the firmware image to /dev/mtd0, and it flashed successfully. I verified this by ejecting the card, reinserting it, recreating the device node, and running cat on that device node again.
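Put together, the final flash-and-verify sequence looks like this as shell commands (firmware.img is a placeholder name for your ROM image; the comparison trick with head is my own addition, since the card is larger than the image):

```shell
# Recreate the MTD character device node (major 90, minor 0 = mtd0, read/write)
mknod /dev/mtd0 c 90 0

# Flash the firmware image to the card
cat firmware.img > /dev/mtd0

# After ejecting the card, reinserting it, and recreating the node,
# read back only the image-sized prefix of the card and compare it
head -c "$(wc -c < firmware.img)" /dev/mtd0 | cmp - firmware.img && echo "flash verified"
```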

Hopefully, if someone needs this, it’s available as a reference now.

Dieselgate and CO2 emissions

In case you’ve been living under a rock, Volkswagen was caught cheating on emissions testing with their 2009-2016 diesels. Recently, a proposed settlement including a buyback program (as well as, potentially, a fix) has been announced for the 2009-2015 2.0 liter vehicles. One concern, however, is the CO2 emissions impact of this buyback – both in terms of manufacturing emissions, and in terms of fuel consumption.


MPG is bullshit

That’s right, MPG is bullshit (and along with it, MPGe).

I don’t mean that the emphasis on improving fuel efficiency in personal transportation is bullshit; on the contrary, it’s one of the most important things we can do today. Similarly, I see the value in representing the efficiency of non-gasoline vehicles in gasoline-equivalent units of energy – it helps put things into context for an efficiency-minded buyer.

The problem is that MPG, as a metric for measuring vehicle efficiency, is terrible at representing that efficiency in an intuitive way. As a result, it has discouraged improvements in efficiency in the vehicles that need them the most, and directed efforts toward vehicles that need them less. To illustrate this, I’ll create a scenario.

You have two older vehicles that you drive about equally, one that gets 15 MPG (probably a full-size pickup or a large SUV), and one that gets 30 MPG (probably a compact car). You’ve got the funds to replace one vehicle, and you want to get something more fuel efficient, without losing capability – so you’re looking at either another full-size pickup or SUV, or another compact car. You look at the vehicles that are available, and see that you can get a full-size truck or large SUV that gets about 20 MPG. Alternately, you can get a car that gets 50 MPG nowadays, and it’s even a fair bit bigger than a compact. Which should you buy, to reduce your fuel consumption by the most?

Intuitively, you’d get the car – it gets 20 MPG more than your existing car, a 67% improvement. The truck only gets 5 MPG more, a 33% improvement.

And this gets into why MPG is bullshit – MPG measures how far you go on a fixed amount of fuel, but you don’t drive a fixed number of gallons, you drive a fixed number of miles. In Europe (at least outside of the UK, anyway), the standard is to report fuel economy in liters per 100 kilometers, which answers the question of how much fuel it takes to go a fixed distance, instead of how far you can go on fixed fuel. Metric system issues aside, I’ll illustrate how this is a superior system for representing efficiency, using gallons per 100 miles – the familiar units in the US.

Under the gallons per 100 miles system, your truck is rated at 6.67 gal/100 mi and your car at 3.33 gal/100 mi – the conversion is merely 100 divided by the MPG. The new truck and new car are rated at 5 gal/100 mi and 2 gal/100 mi respectively. So the new truck uses 1.67 gallons less fuel over 100 miles, whereas the new car only uses 1.33 gallons less over the same distance. Upgrading the truck reduces your fuel consumption more than upgrading the car, even though the intuitive ways of looking at MPG (numeric or even percentage improvements in MPG) make the car look like the better option.
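The arithmetic is easy enough to check yourself. Using the numbers from the scenario above, a quick awk one-liner does the conversion and compares the savings:

```shell
# gal/100mi = 100 / MPG; fuel saved over 100 miles = old rating - new rating
awk 'BEGIN {
  truck_old = 100/15; truck_new = 100/20;   # 6.67 and 5.00 gal/100mi
  car_old   = 100/30; car_new   = 100/50;   # 3.33 and 2.00 gal/100mi
  printf "truck upgrade saves %.2f gal per 100 mi\n", truck_old - truck_new;
  printf "car upgrade saves   %.2f gal per 100 mi\n", car_old - car_new;
}'
```

Running this prints 1.67 gallons saved for the truck and 1.33 for the car – the truck upgrade wins, despite the smaller MPG gain.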

Ultimately, because MPG as a measurement is relatively insensitive to even large improvements in efficiency in inefficient vehicles, while magnifying minor improvements in efficiency in already efficient vehicles, it’s arguably hurt the American automotive marketplace. The American automotive market is one that buys plenty of large, inefficient vehicles for various reasons, and in those vehicles, if a consumer sees a “mere” 1 or 2 MPG difference between two models, they may be less inclined to take a more efficient option, even though it would save a significant amount of fuel. Conversely, consumers may prioritize replacing already efficient vehicles with vehicles that are only slightly more efficient, because of a large difference in MPG.

It’s worth noting that the US government’s website for information on vehicle fuel economy lists gal/100 mi and kWh/100 mi figures in addition to the MPG (or, for electric vehicles, MPGe) figures – albeit in smaller print, leading with MPG or MPGe – and allows users of the site to display figures in either gal/100 mi or l/100 km. I applaud them for that much, but I’d personally like to see MPG abolished altogether, in favor of gal/100 mi (l/100 km is just asking too much, especially because fuel’s sold in gallons and distance is measured in miles in this country) as the primary method of reporting for liquid-fueled vehicles, including on the Monroney sticker that’s on all new cars.

Windows 10 DPI scaling and window positioning issues on laptops

If you’re using a Windows 10 laptop at anything other than the default scaling factor for your display, you may encounter an issue where closing the lid causes your window positions and sizes to be forgotten. I discovered this on my MacBook Pro Retina, which I run at 100% scaling (the default is 200%). This may also apply on Windows 8.1, but I haven’t tried it on this hardware.

In Windows 8.1, Microsoft introduced a new model of handling DPI scaling, that allowed different monitors to have different scaling factors. This is useful for situations where you’re running multiple displays of vastly different density, as it’ll make applications appear roughly the same size on all monitors.

However, at least in Windows 10, there’s a problem with this. Windows detects displays when they’re attached, determines a proper default DPI, and applies it; only then does it apply your preferred scaling factor. The problem is that closing the lid on a laptop effectively detaches the display, and opening it back up causes redetection. There’s no problem if you’re happy with the default scaling factor, but if you’re running a non-default one, this can cause huge problems: apps can get stuck at the old scaling factor, and windows will be rearranged and resized.

A workaround for this problem (if you don’t need different DPI for each monitor) is to disable Windows 8.1-style DPI scaling. On 8.1, this could be done by checking the “Let me choose one scaling level for all my displays” checkbox. That checkbox is no longer available in Windows 10, but the registry value it changed still works: HKCU\Control Panel\Desktop\Win8DpiScaling. Change it from 0 to 1, reboot, and all displays will have the same DPI, and closing your lid won’t change your window layout.
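If you’d rather not hunt through Registry Editor by hand, the same change can be applied by importing a .reg file (a minimal sketch, assuming the value is a DWORD at the path given above – reboot afterward for it to take effect):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Control Panel\Desktop]
"Win8DpiScaling"=dword:00000001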

Obviously, there’s an underlying bug in display detection that Microsoft needs to fix (as the feature that I’ve disabled actually is a useful one). However, for my case, where I’m happy with all displays running at the same scaling factor, the old way of multi-monitor scaling (as used in Windows 98 (the first version with official multi-monitor support) through 8.0) works perfectly fine.

Quick guide on upgrading a WinBook TW700 to Windows 10

It’s rather tricky to get Windows 10 onto a WinBook TW700, between WIMBoot’s inefficiency, the inability to delete things from the preload, and the limited storage available on a TW700, so I thought I’d write this quick guide on how to get the device updated. These instructions should work on any WIMBoot device – or, for that matter, any Windows 8.1 Update device that’s short on space (possibly earlier versions too; I’m not sure whether Win10 setup will run from 8.0 or pre-Update 8.1). Please note that your profile and apps will NOT be migrated – I’d use the User State Migration Tool, from Windows 10’s Assessment and Deployment Kit, to save profiles, and then restore them after you have the device reloaded. I’ve not used USMT in quite a few years, though.

Following this procedure will result in the loss of all data and applications on the device, and I will not be held responsible for data loss as a result. You are responsible for ensuring that you can get everything restored.

Here’s what you’ll need:

  • Two USB thumb drives of 8 GiB or larger capacity (one of these may not be necessary)
  • A MicroSD card or a third USB thumb drive of 8 GiB or larger capacity
  • A USB hub, keyboard, and mouse (shouldn’t be necessary, but it’s useful in case something goes wrong)

Here’s the procedure that I followed in a nutshell:

  1. Create a USB recovery drive and put it in a safe place. This will allow you to go back to the factory preload at any time. This is a good practice even if you plan on staying on Windows 8.1.
  2. Reset your PC (Microsoft directions, go to “Remove everything and reinstall Windows”). This is necessary to get free space back.
  3. When the Out-of-Box Experience comes up, connect to a wireless network, but do not log in using a Microsoft account (you don’t want any OneDrive data downloaded), and disable Windows Update (you do not want updates taking up the storage space you just freed).
  4. Use the Media Creation Tool on another Windows machine to create an install USB drive for Windows 10 Home 32-bit English (US), then connect that USB drive to the tablet and run setup.exe on it. You may not need this step – it may be possible to use the Media Creation Tool’s ability to update the computer it’s running on – but I played it safe.
  5. The Windows 10 installer will mention that you need more space to install. Either use a USB hub to connect another thumb drive, or insert a MicroSD card, and select that drive in the installer.
  6. I personally chose to change the settings that would have migrated my profile and apps, and chose to only install the OS. This one’s up to you, though.
  7. Wait a while while Windows 10 is installed. The device may freeze at a few points during boot; just shut it off by holding the power button down, release the button, wait 5 seconds, and press it again – it will proceed normally.
  8. Go through the Out-of-Box Experience and adjust settings to your liking. Once this is complete, the device may be sluggish for a while, as it’s performing a lot of background tasks (updates, driver installations, and the like). Let it complete these before continuing.
  9. Delete your previous version of Windows. Why keep it around, when it’s just the preload missing the software that came with it, and you’ve got a thumb drive with the USB recovery drive?
  10. Somewhere along the way, a driver update will have happened, and Windows will have decided to run at 125% zoom. If you like this, leave it alone. If you don’t, go to Settings, System, Display, and change the size of text, apps, and other items to 100%. Do note that Modern UI apps are rendered smaller than in Windows 8 (and currently don’t appear to respect the system scaling setting), and Universal apps tend to have smaller UI elements than Modern UI apps did in Windows 8. However, Win32 apps are rendered the same at 100% as in Windows 8. This one’s up to personal preference, really.

That’s all there is to it. (Well, a Bluetooth driver update may be needed – I haven’t checked that fully yet…) There are still some glitches in Win10 (the on-screen keyboard only really works the way it did before when in a Universal app – Modern UI apps at 100% don’t quite behave right, and Win32 apps don’t move out of the way at all), but generally, things should work. And it’s faster, and uses less RAM and disk space than before (Win10 has a much better compression mechanism).

WinBook TW700 first impressions, survival guide

So, Micro Center is selling Windows 8.1 tablets for $60. No, really. That includes a Windows license, a quad core Atom (that’s right, this isn’t even RT), an IPS display, and a friggin’ Office 365 Personal license (even with rights to install on a desktop or laptop)! Now, it does only have 1 GiB RAM, and worse, only 16 GiB of eMMC, so there were corners cut. However, even with those limitations, the price really does seem a bit too good to be true.

Then again, it’s only $60, and Micro Center does take returns (and there are plenty of open box units for $48, although I strongly suggest avoiding those for a couple of reasons), so… I ultimately couldn’t resist (if nothing else, it’ll be a decent device for running the excellent VCDS Volkswagen diagnostic tool by Ross-Tech), and I’m typing this post on it.

The latest incarnation of my mainframe

So, I finished resurrecting my mainframe the other day.

When I was in Kansas City for KansasFest, I bought an RS/6000 7011-250, which uses the smallest chassis capable of holding 32-bit MCA cards that IBM ever produced – for comparison, I’d say it’s about the size of a Quadra 610 or Power Mac 6100, albeit a bit deeper. (It was also the first PowerPC machine ever produced, predating the PowerPC Macs by a few months, using a 66 MHz or (in my machine) 80 MHz CPU.) This allowed me to physically downsize the machine significantly, with minimal loss of functionality.

I installed the P/390 card set and installed AIX while at KansasFest, but was unable to get the mainframe actually running, and I ended up putting the project on hold for various reasons (including building the Mimeo).

Lately, I decided to get back to it.