
title:
Put the knife down and take a green herb, dude.


descrip:

One feller's views on the state of everyday computer science & its application (and now, OTHER STUFF) who isn't rich enough to shell out for www.myfreakinfirst-andlast-name.com

Using 89% of the same design the blog had in 2001.

FOR ENTERTAINMENT PURPOSES ONLY!!!
Back-up your data and, when you bike, always wear white.

As an Amazon Associate, I earn from qualifying purchases. Affiliate links in green.


Saturday, August 08, 2020

I'm not sure why I haven't been exposed to this earlier in my life, but it's definitely a clear example of brilliance. (Reminder: My definition of brilliance is any solution to a problem that, once seen, can't be unseen, and that makes you wonder why you never thought of it yourself. Which isn't to say you would've, but that the solution fits that description perfectly.)

Let's cut to the chase: Studying analog computers today, I ran into (via Wikipedia) an old Navy doc from 1944 describing an analog computer they used, titled Basic Fire Control Mechanisms, Ordnance Pamphlet (OP) 1140, which has been nicely scanned and presented as a PDF. Lookit how these things work. Simplicity itself, but I'd never thought of making something like it.

Easy, right? If you want to multiply by 36, you have a gear where X teeth of movement (say just enough to move a dial so that what's in the output window changes from 1 to 2) turns a second shaft 36 times X teeth. Then you check the number that shows in the output window (also listed on a dial) that's regulated by that shaft.
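The gear arithmetic above can be sketched in a few lines. This is a toy model, not anything from OP 1140 -- in particular, the teeth-per-digit figure is a number I made up for illustration:

```javascript
// Toy model of a gear multiplier. An input shaft moves some number of
// teeth; a gear train with ratio `ratio` moves the output shaft
// `ratio * teeth` teeth.
function outputTeeth(inputTeeth, ratio) {
  return inputTeeth * ratio;
}

// Suppose (hypothetically) 10 teeth of movement advance a dial by one
// digit. Then the output dial always reads `ratio` times the input dial.
const TEETH_PER_DIGIT = 10; // assumed, for illustration only

function outputDial(inputDial, ratio) {
  return outputTeeth(inputDial * TEETH_PER_DIGIT, ratio) / TEETH_PER_DIGIT;
}
```

Turn the input dial to 2 with a 36:1 gear train and the output dial reads 72 -- the machine has "multiplied by 36" with nothing but brass.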

In this case, the Navy wanted to set up a computer where sailors would enter values for the same specific variables for each calculation and have the machine compute the values needed to set and fire shells from their artillery.

That's pretty cool. No, more to the point, that's brilliant. No energy needed. Easy to repair, all things considered. Doesn't require any insanely specialized knowledge to work on or with. Not real flexible -- you're give or take doing the same calculation each time -- but in this case, who cares? As the OP says...

The basic mechanisms described in this book were especially developed over a period of about 30 years to do a highly specialized job. That job is to solve mechanically the mathematics for surface and anti-aircraft fire control. These basic mechanisms make the necessary computations to point the guns and set the fuzes to hit fast moving targets with shells fired from the deck of a ship which is moving, pitching, and rolling. To aim the guns correctly under these conditions about 25 things must be taken into account all at the same time. These include target speed, climb, and direction; target range, elevation, and bearing; pitch and roll; and initial shell velocity.

If the enemy were to announce six or eight hours beforehand just where the target would be at a particular instant and just how it would be moving, a lightning mathematician would be able to calculate where to point the guns to hit it at that one instant. But, the results would be good only for one instant.

Now I have my doubts about how "lightning" the mathematician would have to be, since we're just doing an easily, if tediously, delineated set of multiplication/division/logarithmic (?) operations each time (how many math questions can you answer in "six or eight hours"? Um, lots), but point taken. Pretty cool.


But why [do you care]?

Why did I run into this today? One pastime I've been pouring waaaay too much time into recently is the study of DIY headphone amps and cassette players. And one of the important parts of any (well, most any) headphone amp is its "op amp". The op amp is the piece that, when fed a little juice, makes the tiny electric current that's created by the magnets recorded into [sic] your tape as they pass your cassette player's read head loud enough to hear.

If we read our canonical work, Op Amps for Everyone, we learn that the name "op amp" is short for "operational amplifier", and they're, in a sense, a new twist on the shaft-based analog computers we just saw the Navy using to aim shells in the Forties.

The heart of the analog computer was a device called an operational amplifier because it could be configured to perform many mathematical operations such as multiplication, addition, subtraction, division, integration, and differentiation on the input signals. The name was shortened to the familiar op amp, as we have come to know and love them. The op amp used an amplifier with a large open loop gain, and when the loop was closed, the amplifier performed the mathematical operations dictated by the external passive components.

Does that make sense? You pass in a voltage and the operational amplifier -- which, it should be noted, requires a power source to perform its operation -- cuts it by two or multiplies by 7... or 36!... or whatever. It is, in effect, just a way of moving from one setting on one shaft through "electric gears" to another.
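One concrete version of those "electric gears" is the textbook inverting amplifier, where the closed-loop gain is set entirely by two external resistors: gain = -Rf / Rin (the minus sign because this configuration inverts the signal). This is standard op-amp theory rather than anything from OP 1140 or the CMoy docs specifically, and the resistor values below are illustrative:

```javascript
// Inverting op-amp configuration: closed-loop gain is dictated by the
// external passive components, gain = -Rf / Rin.
function invertingGain(rFeedback, rInput) {
  return -rFeedback / rInput;
}

// Swap resistors to change the multiplication factor -- the electrical
// equivalent of swapping gears.
const divideByTwo = invertingGain(10000, 20000);  // gain of -0.5
const timesSeven  = invertingGain(70000, 10000);  // gain of -7
const times36     = invertingGain(360000, 10000); // gain of -36
```

Same shaft-and-gear idea: the "ratio" lives in the parts you bolt on around the amplifier, not in the amplifier itself.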

In an amplifier, we take in a current (?) and multiply it to produce more volume (or is that gain?). Different op amps, like different gears, multiply by different amounts. That multiplication is measured here in decibels, a logarithmic scale.
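For voltage, the conversion between decibels and a plain multiplication factor is dB = 20 * log10(Vout / Vin). A quick sketch:

```javascript
// Voltage gain expressed in decibels: dB = 20 * log10(Vout / Vin).
function voltsToDb(vOut, vIn) {
  return 20 * Math.log10(vOut / vIn);
}

// ...and back again: the multiplication factor for a given dB gain.
function dbToFactor(db) {
  return Math.pow(10, db / 20);
}
```

So a 30 dB stage multiplies voltage by about 32, and halving a voltage works out to roughly -6 dB.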

Here's a list of op amps that work reasonably well in a CMoy DIY headphone amp (for more on the CMoy, sort of "the" famous DIY amplifier, read the original post here, learn to build here, buy a kit here, or learn about alternatives). Note that each has some measurements of the power you've got to put into the op amp to get a corresponding decibel gain for your sound; on this page, he's listed Vmin, 0.5V into 33 Ω and Vmin, 2.0V into 330 Ω -- so how much power (how many teeth in the gears?) is necessary for a specific decibel gain (multiplication factor)?

Anyhow, that's a long-winded way of saying that your old cassette player was very likely a computer. An analog computer capable of just one calculation, but that's all you needed! From the docs for the Elenco Electronics AK-200 Cassette Player Kit (or the pre-soldered AK-250), we see what the goal of your player's integrated circuit was...

To maintain a flat frequency response over the full audible range, both the high and low frequencies must be given a boost... It is therefore possible to boost the high frequencies during the recording process without saturating the tape. This is called pre-equalization.

It is not practical to fully boost the low frequencies during the recording process. It is therefore done by boosting the low frequency response of the playback amplifier. This is called post-equalization. The National Association of Broadcasters (NAB) has set a standard response curve for playback amplifiers. In general, pre-recorded tapes are recorded so that the response is flat over the audible range when played back through an amplifier having this response.

...

As shown on the schematic diagram (Section 13), each amplifier consists of a pre-amplifier and driver with a volume control circuit between them. The playback signal from head A is input to the pre-amp on pin 3. The pre-amp has a gain of 30dB (about 32 times) at 1kHz. Resistors R2, R5 and R6 and capacitors C2 and C4 are placed in the feedback circuit of the pre-amp to provide the NAB standard frequency response.

Emphasis mine, as usual. 

Does that make sense? Because of limitations of the cassette tape medium, you need to boost some frequencies. That calculation is what all the innards of your player are for (aside from all the buttons and mechanics for engaging the capstan motor and all that): They're there to translate the feed from your tapes using the "NAB standard" formula for boosting the signal to a "flat frequency response".

Neat!

Some more cMoy stuff:


Btw, hummingbirds chirp, sometimes when feeding. TILx2.



posted by ruffin at 8/08/2020 10:31:00 AM
Wednesday, January 15, 2020

Here's Today's Adventure in Enjoyable AngularJS...

I ran into some code from a project that looked give or take like this...

namespace.directive('caNavigation', function () {
    "use strict";

    return {
        restrict: 'E',
        scope: {
            inLinks: "@",
            collapseNav: "="
        },
        // ...

        // Note: the controller function goes *inside* the DI array,
        // after its dependency names.
        controller: ["$scope", /* more jive */
            function ($scope /* more jive */) {
                $scope.$watch('inLinks', function (value) {
                    // omgwtfbbq?
                    // ...
                });
            }
        ]
    };
});


Well, that @ and = stuff is pretty cryptic, especially since I hadn't used a directive in AngularJS before (Angular 2+, sure, but I'm still learning ye olde AngularJS -- going backwards, I know. Don't forget I had my first run-in with Perl in 2014 [sic]).

Let's see what ye olde docs say:

[Having to have a separate controller to change direct scopes] is clearly not a great solution.
What we want to be able to do is separate the scope inside a directive from the scope outside, and then map the outer scope to a directive's inner scope. We can do this by creating what we call an isolate scope. To do this, we can use a directive's scope option:
...

Let's take a closer look at the scope option:
//...
scope: {
customerInfo: '=info'
},
//...
The scope option is an object that contains a property for each isolate scope binding. In this case it has just one property:
  • Its name (customerInfo) corresponds to the directive's isolate scope property, customerInfo.
  • Its value (=info) tells $compile to bind to the info attribute.

Good heavens, AngularJS. Not the most discoverable. Seems like in AngularJS in general there's lots of magic stringing. Perhaps I shouldn't be surprised, but that doesn't mean I like it. (Here I mean largely the inline array notation, which is too hipster by half. I mean, I get that you don't want to lose information when you minify, but then how about passing real references rather than referring to objects by their original names? Sheesh.*)

But let's go a little deeper so that we know what the @ symbol means too.
The 'isolate' scope object hash defines a set of local scope properties derived from attributes on the directive's element. These local properties are useful for aliasing values for templates. The keys in the object hash map to the name of the property on the isolate scope; the values define how the property is bound to the parent scope, via matching attributes on the directive's element:
  • @ or @attr - bind a local scope property to the value of DOM attribute. The result is always a string since DOM attributes are strings. If no attr name is specified then the attribute name is assumed to be the same as the local name. Given <my-component my-attr="hello {{name}}"> and the isolate scope definition scope: { localName:'@myAttr' }, the directive's scope property localName will reflect the interpolated value of hello {{name}}. As the name attribute changes so will the localName property on the directive's scope. The name is read from the parent scope (not the directive's scope).
  • = or =attr - set up a bidirectional binding between a local scope property and an expression passed via the attribute attr. The expression is evaluated in the context of the parent scope. If no attr name is specified then the attribute name is assumed to be the same as the local name. Given <my-component my-attr="parentModel"> and the isolate scope definition scope: { localModel: '=myAttr' }, the property localModel on the directive's scope will reflect the value of parentModel on the parent scope. Changes to parentModel will be reflected in localModel and vice versa. If the binding expression is non-assignable, or if the attribute isn't optional and doesn't exist, an exception ($compile:nonassign) will be thrown upon discovering changes to the local value, since it will be impossible to sync them back to the parent scope.
    By default, the $watch method is used for tracking changes, and the equality check is based on object identity. However, if an object literal or an array literal is passed as the binding expression, the equality check is done by value (using the angular.equals function). It's also possible to watch the evaluated value shallowly with $watchCollection: use =* or =*attr
If you've used Angular 2+ (what a failed naming scheme, btw. Why not "Angular.IO" after the website or just NGular or something so that googling this jive would be easier?), you can already see our steady progression to banana boxes.

But for now, what a mess.
  1. If a scope property is a string that starts with @, then we'll initialize that scope property with the directive's DOM attribute that matches what comes after the @.
    • That is, this is a one-way process. Props down, and set up events if changes need to go back up.
  2. If the prop starts with an =, we're doing the equivalent of Angular.IO's banana boxing, it appears.
    • So if we change whatever's in here, the parent is changing too.
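To make those two items concrete, here's a toy, plain-JavaScript illustration of the direction of data flow. This is emphatically not Angular's real implementation -- just the semantics: '@' copies a stringified attribute value down, while '=' reads and writes straight through to the parent.

```javascript
// '@' semantics: one-way, always a string, copied from the DOM attribute.
function bindAt(attrs, name) {
  return String(attrs[name]);
}

// '=' semantics: two-way; gets and sets pass through to the parent scope.
function bindEquals(parentScope, name) {
  return {
    get: () => parentScope[name],
    set: (value) => { parentScope[name] = value; },
  };
}

const parent = { links: ["home", "about"], title: 42 };

const title = bindAt({ title: parent.title }, "title"); // "42" -- a string now
const links = bindEquals(parent, "links");
links.set(["home"]); // parent.links changes too
```

Props down as strings with '@'; a live two-way pipe with '='.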
Back to our original example now... We had a cryptic (to me) $scope.$watch('inLinks', function (value)..., and now we know what that means. @ means we've got a prop that $watch can check for changes, which, in this case, means:

  • The listener is called only when the value from the current watchExpression and the previous call to watchExpression are not equal (with the exception of the initial run, see below). Inequality is determined according to reference inequality, strict comparison via the !== Javascript operator, unless objectEquality == true (see next point)
  • When objectEquality == true, inequality of the watchExpression is determined according to the angular.equals function. To save the value of the object for later comparison, the angular.copy function is used. This therefore means that watching complex objects will have adverse memory and performance implications.
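Those two bullets boil down to reference equality versus value equality, which you can see in plain JavaScript. The deepEqual below is a crude stand-in for angular.equals, good enough to show the distinction:

```javascript
// Crude stand-in for angular.equals, for illustration only.
function deepEqual(a, b) {
  return JSON.stringify(a) === JSON.stringify(b);
}

const prev = { links: [1, 2] };
const next = { links: [1, 2] };

// Default $watch behavior: strict reference comparison.
const changedByReference = prev !== next;      // true: different objects

// objectEquality == true behavior: compare by value.
const changedByValue = !deepEqual(prev, next); // false: same contents
```

Same contents, different objects: the default watch fires, the objectEquality watch doesn't.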

Guess that works. Let's ignore the =* shallow comparison jive for now. All we've really got is one prop called inLinks that will be passed... um... "in" that we'll update whenever it changes, probably after the page finishes loading and making some resource calls. And sure enough...

<li ng-repeat="subLink in link.SubLinks|filter: {ShowInNav: true}" ...

That means once we've loaded the correct navigational links for this context, we'll start pushing them into the DOM.
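That filter expression is, give or take, the plain-JavaScript filter below (the sample data is made up):

```javascript
// Roughly what `subLink in link.SubLinks | filter: {ShowInNav: true}`
// does: keep only the sub-links flagged to appear in the nav.
const link = {
  SubLinks: [
    { name: "Dashboard", ShowInNav: true },
    { name: "Internal", ShowInNav: false },
    { name: "Reports", ShowInNav: true },
  ],
};

const navLinks = link.SubLinks.filter((s) => s.ShowInNav === true);
// navLinks contains only Dashboard and Reports
```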

And our code makes sense. Much rejoicing.



* I actually kinda like AngularJS. It's one clean step away from the inefficient, overly-engineered land of enterprise development Bjarnason discusses and that at least the social convention surrounding Angular 2+ seems to require. But there are a few stupid* conventions like this one that really do AngularJS in conceptually.

* Sorry, I've tried to come up with a good synonym, but "stupid" fits here. My earlier use of "hipster" was my kind attempt at an alternative, but this stuff really feels a little too much like a first-pass solution to be in a nice, mature templating library.



posted by ruffin at 1/15/2020 05:32:00 AM
Tuesday, December 01, 2015

Clued in after reading this...

I've got a place where I want to return entities mixed in with folders in a grid, in this case a Kendo Grid. I want the two "types" of data mixed together in the display, however, which means this isn't as clean as it could be. I've got pretty typical fields for the first query, and then I'm kinda kludging the folders into the same object model, like so...

using (AssetRepository repo = new AssetRepository(AccessControlHelper.GetCurrentUserId()))
{
    var assets = repo.GetAllAssets().Select(a => new {
        id = "a" + a.AssetId,
        clientId = a.ClientId,
        name = a.AssetName,
        tags = a.Tags,
        folderId = a.AssetFolderId,
        thumbnail = "/AssetManager/ImageThumbnail/" + a.AssetId
    });

    var folders = repo.GetAssetFoldersForParentFolderId(null).Select(f => new
    {
        id = "f" + f.AssetFolderId,
        clientId = f.ClientId,
        name = f.FolderName,
        tags = "",
        folderId = f.ParentFolderId,
        thumbnail = ""
    });
    var both = assets.Union(folders);   // <<< NEAT-O, DADDY-O!

    return Json(both);
}

I'm not absolutely sure how I feel about this as a production-worthy strategy... It's obviously taking a dog and making it quack like a duck, which is tres hipster. The problem is that Kendo Grid allows you to have hierarchies, but expects everything on the same hierarchical level to be of the same object type. (My vote is to write a custom table renderer that'd support the concept of folder levels mixed in with entities, but that's understandably not the route this contract is taking. Still, it's so often just as difficult to get a third-party library to work as you need it as it'd be to make your own widget that exactly covers your own use cases.)

Regardless, what's neat to me is that the compiler's smart enough to know these are the same anonymous types and allows you to Union them. That's cool.

(apologies for the Yoda'd blog title. SEO BABY! /sarcasmTinge)



posted by ruffin at 12/01/2015 05:11:00 PM





The postings on this site are [usually] my own and do not necessarily reflect the views of any employer, past or present, or other entity.