Nice article for engineers to understand something that most guitar players will intuitively know.
One of the great things about a hi-gain setup like Hendrix's is how the feedback loop will inject an element of controlled chaos into the sound. It allows for emergent fluctuations in timbre that Hendrix can wrangle, but never fully control. It's the squealing, chaotic element in something like his 'Star Spangled Banner'. It's a positive feedback loop that can run away from the player and create all kinds of unexpected elements.
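The runaway behavior is easy to see in a toy model. Below is a minimal sketch (purely illustrative numbers, not a real amp model) of a saturating positive feedback loop: with loop gain above 1, a tiny excitation grows until the tube/speaker stage clips it; below 1, it dies away into the noise floor. The player lives on that boundary.

```python
import random

def feedback_loop(gain, steps=200, limit=1.0, seed=0):
    """Toy model of a saturating positive feedback loop: the amp
    re-injects the signal each step, and the tube/speaker stage
    clips it at `limit` (the 'controlled' part of the chaos)."""
    random.seed(seed)
    x = 1e-4  # tiny initial excitation (string noise)
    trace = []
    for _ in range(steps):
        x = gain * x + random.uniform(-1e-5, 1e-5)  # loop gain + room noise
        x = max(-limit, min(limit, x))              # saturation
        trace.append(x)
    return trace

# With loop gain > 1 the signal runs away to the clipping limit;
# with gain < 1 it decays toward the noise floor.
howl = feedback_loop(gain=1.2)
decay = feedback_loop(gain=0.8)
```

The interesting part is that the exact trajectory depends sensitively on the gain, the noise, and the starting conditions, which is why no two feedback squeals are quite alike.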
The art of Hendrix's playing, then, is partly in how he harnessed that sound and integrated it into his voice. And of course, he's a force of nature when he does so.
A great place to hear artful feedback would be the intro to Prince's 'Computer Blue'. It's the squealing "birdsong" at the beginning and ending of the record. You can hear it particularly well if you search for 'Computer Blue - Hallway Speech Version' with the extended intro.
> The way you can hear the machine guns, choppers, sirens, screaming in agony…
You know, I've heard that performance so many times over so many decades that I don't have to hit a play button or even close my eyes in order to hear it. It's there inside my head when I want it to be.
And somehow I never interpreted it in that way (sirens, screaming, etc) until just a moment ago. I thought it was just a quirky little early-morning break in the familiar tune from someone who had been up way too long by that point.
And now instead of just being the quirky sounds of an impromptu guitar solo that I can recall whenever I wish, it now has unpleasant pictures to go with it.
The imagery of 1969, I remember it well. The Vietnam war was the first war that was televised. Everyone would watch the nightly news at 6:30 pm (take my word for it) and hear the choppers, gunfire and real life screams of people.
I thought it was sheer genius that Hendrix was able to subtly bring that into the national anthem which made it resonate so well with those purchasing his music. But without that background reference I never supposed that younger generations would hear it entirely differently.
> "The imagery of 1969, I remember it well. The Vietnam war was the first war that was televised. Everyone would watch the nightly news at 6:30 pm (take my word for it) and hear the choppers, gunfire and real life screams of people."
Slightly off-topic--
Before my time, but my professor* recalled to our class his experience watching a _live_ news report from Vietnam. Something shocking happened during the broadcast. As a visual-media scholar he contacted the station to obtain a copy. No go. He remarked how he never saw that footage ever again (at that time it would have been over 15 years ago). In our modern digital age it's difficult to imagine anything going live to the nation, and then disappearing.
* (Charles Chess, Introduction to Film, SJSU, c1992)
Recently, the movie "Cleopatra" was on TV. I was watching it with the sound off while I did other things.
There was one scene where Rich Burton and Elizabeth Taylor were arguing with each other. I watched their lips move, and somehow I heard Burton speaking his lines in his voice, and Taylor her lines in her voice. I had to do a double take to see that the sound was actually muted, but my mind re-created it anyway.
Going way, way off topic: when those two were a couple they had a house in Puerto Vallarta, Casa Kimberly, that's now a hotel. I stayed there once in the late 90s, and from their website it hasn't really changed since I was there. The whole time I could just imagine them being there, living the Hollywood getaway lifestyle. Definitely a cool place to stay (in the old town, not the resort area) and very much worth it if you get the chance, although it does look more expensive than it was then, even adjusting for inflation.
I read your comment and immediately wondered how much of my braincells are permanently occupied with remembering music. Probably quite a lot in an absolute sense but I wonder about the percentage of storage and whether or not that could have been used in other ways. And of course then I wonder if they are stored compressed, and whether that is lossy compression or not ;)
Thousands of songs reside quite comfortably in my brain. It's rather amazing.
I can tell when a musician is lip syncing their hit song, because nobody sings a song the same way twice, and the performance exactly matched the CD version of the song.
It's all tradeoffs. I can't remember names or faces even if doing so is worth money.
Instead, I can recall the complete works of Roger Waters or Nine Inch Nails, but not the names of the songs unless I really studied that part. I can recall themes from TV shows from decades ago, but be unable to place the name of the show.
At any given time, anywhere at all, I can listen to any of at least five different covers of Fat Bottomed Girls -- and have no idea who performed any of them, and therefore no ability to share them with others.
It's an interesting way to be and it is the only way I know, but there's reasons that I'm terrible at being a DJ.
If you listen to the Woodstock soundtrack it is clear that Hendrix was on a completely different musical level than anyone else in that scene. Ravi Shankar was probably the only person there above him from a chops perspective and possibly in the expressivity department as well. But when it came to sheer inventiveness no one was close to Hendrix. I cannot imagine what it must have been like to see and hear him. It must have felt like an alien was performing.
The Who followed him, and famously destroyed their entire set in a vain attempt to be noticed.
Like a jealous plumber, worried that Kim Kardashian's "Break the Internet" photo series will take away from his appeal, hurriedly posting photos of his plumber's crack online...
I've not listened to that song much at all. I am however obsessed with Machine Gun which has all those elements and more. Maybe I'll have a re-listen to SSB.
Do it; I think the political subtext of weaving an anti-war statement into the national anthem makes it both very obvious and very elegant at the same time.
I wonder whether tube harmonics emulated by solid-state settings have shaped music. Of course they have; music from that era is instrument-oriented.
The discovery of feedback tones and their incorporation into the musical experience: a three-hour-warm bank of tubes turned up to the limit, with a maxed-out savant unlocking new realms of sound.
This leaves me wondering what would happen if you attached a coupling to a trumpet and ran the sound through an effects/feedback box. Why should electric guitars have all the fun?
People do! But you have to sit there and buzz your lips to make a trumpet make sound, whereas for a guitar you just have to shake the strings, and the sound coming from the amp will do this shaking for you, completing the feedback loop. So it's mostly portable stringed instruments that get this treatment. There are some violin players who play with feedback effects. I hear Jon Rose is one, but I'm not familiar with his music. Folks like Jean-Luc Ponty and Jerry Goodman make ample use of guitar pedal effects on their violins. And there's a YouTuber out there who plays with them on her harp.
Well, I remember a performance by Jorge Lima Barreto (Portuguese electronic/free jazz) playing with a saxophonist using two microphones, one normal and the other with a brutal delay. He would play into the normal microphone, and sometimes he directed the instrument's output to the delayed microphone, and it sounded monumental. I'm not sure who the other musician was; I think it was Tomas Stanko, but I'm not sure. The performance sounded like you'd gone through a big storm. :D
> The art of Hendrix's playing, then, is partly in how he harnessed that sound and integrated it into his voice. And of course, he's a force of nature when he does so.
One thing I notice is that his playing doesn't require a rhythm guitarist. What I discovered worked so well was Mitch Mitchell: as a jazz drummer, his playing was heavily influenced by the classics, and in a way it complemented Jimi's guitar tone perfectly.
While I love Mitch's drumming and Noel's bass, can you imagine if Hendrix had worked with Ginger Baker and Jack Bruce, both much more confident and strident players than the Experience's rhythm section?
That would have blown the doors off of everything.
I don't think there was another as "out there" guitar player as Jimi until EVH came along - a little more controlled, but just as confident and chaotic. EVH was quite the systems engineer himself (variac, Floyd Rose later on etc)
Miles always impressed me with his ability to pick the best to back him up, and /then/ let them take the front. Some tracks he barely plays on, waiting minutes for his entry.
Jimi wanted the best to back him up. But I agree with you; I'm just pointing out why I think he didn't.
It's quite likely that when Hendrix went to London the first time, he was the first person ever to play a Stratocaster through a Marshall full stack at full volume.
Also maybe not until the night of his first big gig there.
Townshend had Marshall build 100-watters so he could play louder clean. Clapton had already been cranking it with a Gibson SG, which is a characteristic sound all its own; he was in the audience at the gig and was blown away watching Hendrix.
Every year from at least 1964 to 1984, more advanced amps were made than ever existed before.
I think I recall reading about Hendrix that he tried to emulate the sounds of cartoons with his guitar, and then when he was in the army he did the same with trying to reproduce the sounds of fighter jets. Not sure if urban legend, but cool origin story.
I agree but when you’re dealing with celebrities people sometimes lie and exaggerate, and third parties sometimes extrapolate beyond any semblance of grounded facts. So most people subject to that level of scrutiny and fame are likely to have some allegations against them whether true or not.
Hendrix’s girlfriend Kathy Etchingham claims he never abused her. Some third parties dispute her claims about her relationship.
His arrest record suggests at least some type of altercation with a previous girlfriend but it’s far from clear cut to me.
People are complex and reality is complex. I myself was subject to false accusations about abuse from a disgruntled ex girlfriend (who actually WAS in fact physically and mentally abusive to me and I have the scars to prove it).
But regardless, I have zero issues reflecting on a person’s accomplishments and talents even in the context of them being a horrible person. In fact, I find that part of the intrigue of really talented people. Reality and people are quite multi-dimensional. The only general rule I know is that nobody is perfect and holding up ANYone as some example of moral perfection is almost certainly wrong.
Part of what makes Hendrix's live performances so great is how completely unreproducible they are. Even Jimi himself could never recreate that one note sustain when he begins the solo on Machine Gun. To re-create it, you'd have to set the room up exactly the same, tune the guitar exactly the same, position the guitar relative to amps exactly the same, etc. So Hendrix being very sensitive and connected to the room was able to harness that energy into something unique that stands the test of time. Machine Gun is well known, but his Red House performance at Randall's Island also stands out to me as exceptional, those are the 2 key Hendrix performances. I read somewhere that Miles Davis was really impressed by Machine Gun and you can see why.
One thing I learned after buying some gear at home to try to record electric guitar at low volume is how important the physics of the speakers are. You can plug a tube amp into a cabinet simulator and you'll lose a lot more than using solid state electronics on a good but not great Fender amp, especially if you use fuzz / distortion pedals.
I'm not sure Hendrix was a systems engineer, but he was a transcendent blues artist, that's for sure.
> Electric guitars attack hard, decay fast, and don’t sustain like bowed strings or organs.
Since the 1980s, we have had the "Sustainiac": an active circuit installed in the electric guitar along with a "reverse pickup" which is energized in order to excite vibration in the strings.
With this device, at the flip of a switch, you get indefinite sustain on any note on the neck, at any volume, distortion or not --- even if the electric guitar is not plugged into an amplifier at all, and just heard acoustically.
The best implementations of this have a three-way harmonic switch. You can choose between exciting the fretted (or open) note itself (the fundamental, a.k.a. the first harmonic), an octave above it (the second harmonic), or a higher harmonic still.
You can be sustaining the given note, and then at the flip of a switch, it will fade over to the higher harmonic.
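For what it's worth, the harmonic relationships behind that switch are simple integer multiples of the fretted note's frequency. A small sketch, assuming 12-tone equal temperament and an idealized string (the function names and string frequency here are just illustrative):

```python
def fretted_freq(open_string_hz, fret):
    # 12-tone equal temperament: each fret raises pitch by a factor of 2**(1/12)
    return open_string_hz * 2 ** (fret / 12)

def harmonic(fundamental_hz, n):
    # n-th harmonic of an ideal string: an integer multiple of the fundamental
    return n * fundamental_hz

high_e = 329.63                    # open high-E string, approx. Hz
note = fretted_freq(high_e, 5)     # A at the 5th fret, roughly 440 Hz
octave = harmonic(note, 2)         # the 'second harmonic' switch position
higher = harmonic(note, 3)         # third harmonic: a fifth above the octave
```

So the "octave" position simply drives the string at twice the fundamental, and the third position at a higher multiple still.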
YouTube videos of this in action are worth checking out.
If you don't want to or can't install a Sustainiac pickup, you can get a much cheaper handheld one-string "E-Bow" that does the same thing. It's not as easy to use as a Sustainiac and you can't also be playing with the whammy bar unlike with a Sustainiac, but you can get it to do tricks a Sustainiac can't do: see the "spiccato" section in https://www.youtube.com/watch?v=b0V3pzxma-8
I've also managed to make an E-Bow work with a steel-string acoustic guitar (but only on one string IIRC).
You might enjoy this video. He really goes deep into using the guitar to create textures and emotions. He talks about the Edge (U2) and his Infinite Guitar, and how he actually called Michael Brook to see if he could get one. Eventually Fender did a custom build on his Clapton Strat, which became the Fender EOB Sustainer.
Hendrix and Mayer created a great sound, but I've always thought the most incredible thing about Jimi Hendrix was: he only played the guitar about 11 years. TOTAL. He picked it up around age 15, and died age 27.
That's wild to think about, I've been playing the guitar longer than that yet his are heights I'm unlikely to reach. He was such an innovative guitarist.
Interesting factoid: modern guitar effects typically have their input jacks on the right-hand side, and output jacks on the left. In this article's guitar rig diagram, the jacks are reversed, but this is accurate: back then, for whatever reason the jacks were reversed on each of these pedals. Modern reissues of the round-enclosure Fuzz Face pedals preserve this pattern despite the reversal of industry trends.
the ergonomic advantage of left-to-right is that most players use right-handed guitars, so the guitar's cord comes out the right side of your body, and it's most ergonomic for it to be directed straight away from you to the right side of your pedal board, not criss-crossing in front of you towards the left side of your board.
>criss-crossing in front of you towards the left side of your board.
No need to criss-cross the cable in front of you, you can connect the cable to the guitar on the right side and the cable will go behind you and emerge on the left side, into the pedal board/fx processor.
With your cable in your right hand, it is easier to plug into the right side of a pedal. If you were to try to do the equivalent with your left, the guitar neck would be a little bit in the way as well.
I strongly believe that if you set aside genre preferences the solid body electric guitar coupled to a tube amplifier is objectively the greatest electronic instrument ever created.
All other electronic instruments, with the one exception of the Theremin, have a fundamental problem with human expression. There is an unsolvable disconnect between the performer's actions and their audience.
With an electric guitar you get the physicality and dynamism of an acoustic instrument with the complex timbres and extended technique possibilities of an electric/electronic instrument.
There are complex and musically significant feedback loops occurring across many dimensions that lead to extremely complex transformations of timbre via both traditional music theoretical techniques and the physics of a tube amplifier combined with an inductive load (the guitar pickup).
It's really crazy how much more dynamic and complex this can be than even a highly sophisticated modular synthesizer or whatever. Even the way you overload the power supply in a tube amplifier can be manipulated on the fly to enhance and transform timbre.
Then on top of all that it is so incredibly physical that a performer like Jimi Hendrix can manipulate these systems and have the audience intuitively understand what he is doing. Never in a million years would THAT be possible with any other electronic instrument.
The reverse example of this is musicians who play techno with analog instruments, like Pipe Guy, Basstong, and Meute[0][1][2].
There are always some people who get extremely defensive whenever I say that techno didn't click for me until I heard this kind of "techlow" music. Specifically about the part where I think that the reason is also a human expression problem, because of limitations imposed by the electronic media used.
EDIT: having said that, I don't think I would agree with your premise, because it is colored by a subtle form of survivorship bias. None of us remembers what it's like not to know electric guitars or what they sound like, so claiming "the audience intuitively understands what Jimi Hendrix is doing" is like saying everyone "intuitively understands" their native language. On top of that, there's nothing about the workings of an electric guitar that wouldn't in principle work for something like an electric violin or whatever.
The whole thing about people being defensive is interesting. I love techno, but anyone who has learned other styles of music recognizes the repetitiveness and quirks of a lot of techno and some other electronic genres.
They do a great job with changing their timbre and tones but often ignore a bunch of other factors that make music interesting: the rarity of time signatures other than 4/4, the way certain rhythms are locked into certain genres, the choices of keys used, the limited or missing chords, etc. At some point you hear two electronic songs that sound totally different at a superficial level and you realize they're incredibly derivative of each other.
You might also enjoy Beardyman, if you haven't run across him yet. Does techno and other genres with nothing but his own voice and a shedload of ipads: https://www.youtube.com/watch?v=DYVUlx7BhhI
Nathan Flutebox Lee and Beardyman @ Google, London [1] is one of my favs. At the time it was available on 'Google Video', from before they acquired YouTube, so I don't have a link to the original post. SPOILER: especially that bit with the Godfather theme when he says "Google"; it's just epic and ballsy.
> There are always some people who get extremely defensive whenever I say that techno didn't click for me until I heard this kind of "techlow" music. Specifically about the part where I think that the reason is also a human expression problem, because of limitations imposed by the electronic media used.
I guess the part people don't like hearing is the implication techno is somehow not expressive. I'm not sure that it lacks expressiveness, but it is certainly more "controlled" than traditional music. When I first heard techno as a teenager in the 90s, my mind was blown. I remember exactly where I was the first time I heard Underworld [1], Photek [2], and Autechre [3]. I think I was attracted to these sounds _because_ they were so different. I think it's hard for electronic music fans like myself to accept the idea that it isn't expressive _because_ it is so different. Isn't it just a different kind of expression?
Still, people like what they like. I'm glad you found a version of dance music that works for you. I've long since moved on being judgmental about people's musical tastes. I think it's just wonderful that music exists at all!
> I guess the part people don't like hearing is the implication techno is somehow not expressive.
I think of it more like a painter's palette: every instrument and tool involved in creating music has a different set of colors to choose from, and can also filter some "colors" out if we think of things like audio processing filters.
The tools and techniques typically used to produce techno filter out "colors" that feel essential to me to connect with a song, and yeah, that "controlled" aspect of it is probably a large part of that. That doesn't mean it's not expressive, it's just expressive in a way that I struggle to connect with.
EDIT: funny enough I actually have protanomaly, so my choice of analogy is slightly ironic there. Some visual art and design out there objectively looks terrible from my subjective experience, since the colors look completely off. But that doesn't mean I'm saying the art is objectively bad.
Legends Never Die - Leagueoflegends + Ethnic Instruments by Belle Sisoski [1]. And no, I've never played LoL, I probably never will, and I haven't seen that series based on it (Arcana or something?) either.
Also, I haven't checked what Juno Reactor are doing these days, but their old work is fantastic. My fav show of theirs is Juno Reactor – Shango Tour 2001 Tokyo [2].
For electric violin, I love Ed Alleyne-Johnson [3]. I've never seen him live (I'm not from the UK) but I own a couple of his earlier works. It reminds me of the time when my dad was in the final years of his life, and when he finally passed away. Makes me cry every time.
Nice addition! First time I heard of them and I'm liking what I'm hearing so far.
And just to clarify: I don't dislike electronic instruments. I just think that on some subconscious level the human brain can detect other humans playing a live instrument. Like there's something "embodied" in the sound that is likely missing from a pure electronic instrument. And I needed that element to "unlock" access to techno.
Yep, there's a reason we have the industry term "humanization" in sound design, composition and arrangement.
Tons of work has been done on various modes of humanization by trying to parameterize and modulate these aspects over time. Timing accuracy, velocity variance, chance, etc.
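As a sketch of what those parameters look like in practice, here's a minimal humanizer for quantized (start_ms, velocity) note events. The parameter names and jitter amounts are illustrative defaults of my own, not any particular tool's API:

```python
import random

def humanize(notes, timing_jitter_ms=12.0, vel_jitter=8, drop_chance=0.0, seed=42):
    """Apply simple 'humanization' to quantized MIDI-like notes:
    each (start_ms, velocity) pair gets a small random timing and
    velocity offset, with an optional chance of dropping the note."""
    rng = random.Random(seed)
    out = []
    for start, vel in notes:
        if rng.random() < drop_chance:
            continue  # ghosted/omitted note
        start += rng.uniform(-timing_jitter_ms, timing_jitter_ms)
        vel = max(1, min(127, vel + rng.randint(-vel_jitter, vel_jitter)))
        out.append((start, vel))
    return out

grid = [(i * 250.0, 100) for i in range(8)]  # rigid eighth notes at 120 BPM
played = humanize(grid)
```

Real humanizers go much further (correlated drift instead of white noise, groove templates, per-limb timing), but even this tiny jitter is enough to take the machine-gun edge off a quantized part.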
A well-played instrument certainly feels like someone speaking and expressing themselves to you. There are attempts to capture this with MPE instruments such as the Osmose, or Imogen Heap's MiMU gloves.
> have a fundamental problem with human expression.
How up to date is this opinion of yours? Expression on guitar is pretty intuitive, but modern electronic instrument manufacturers have been working on this problem and created modes of expression that definitely solve this problem.
For example, EWIs allow you to use breath control for expression with many of the same techniques available on actual wind instruments. Also many synths now have features like polyphonic aftertouch, pitch/mod wheels, which allow you to add expression to a note while it is playing. Apps and hardware exist which allow you to use novel methods of capturing motion or other forms of expression. And most modern synths/midi controllers allow you to decide what parameters are affected.
> Then on top of all that it is so incredibly physical
That's an affectation. I can stand on my tiptoes and close my eyes when bending up a note on the synth the same as I can on the guitar. Neither affects the sound, and both are a conscious decision to project an appearance of "I'm really shredding"
> With an electric guitar you get the physicality and dynamism of an acoustic instrument with the complex timbres and extended technique possibilities of an electric/electronic instrument.
That can apply to any instrument once you "electrify" it. What makes a guitar more expressive than a cello or trumpet with a pickup/mic running through effect processing? I play guitar, keys and trumpet, and while I agree that a casio keyboard has limited expression options, your opinion doesn't sound researched.
Hmmm, I disagree, having played electric and acoustic guitars for over two decades and begun learning piano and synths for the first time in 2025.
For one, you can’t easily play two melodies simultaneously across several octaves, using both of your hands, with an electric guitar.
Stringed electronic instruments do have their advantages, but so do the others. Each music making thing has its place in the spectrum.
Two books that have helped me greatly in my musical life, in case people haven’t heard of them, are The Listening Book, and Bridge of Waves, by W.A. Mathieu.
You can play with both hands on a Chapman stick, right hand can do the bass, the left the melody/chords or vice-versa (Chapman stick is played tapping the strings with both hands)
There are certainly guitarists who can play simultaneous melodies.
If you're limiting to a 6 string guitar the distance between the two melodies would be limited compared to a piano but guitars don't have to be limited to 6 strings.
Classical guitar is full of this kind of thing.
Having taken piano lessons but being more into guitar I think the thing is almost all people who play piano are introduced to this and it is a core concept in far more piano music than guitar music. But it is not impossible on guitar, and many works for piano that get adapted to guitar require the player to do so.
E.g., there are plenty of players who have studied and played the Well-Tempered Clavier on guitar.
Guitars certainly have a more intimate connection between the touch of fingers and the sound, including the bending of the tone, one of Hendrix’s virtuosities.
Keyboards can approach that with polyphonic touch keys like the Hydrasynth (lean into keys, pressing them harder, for bending the tone in a configured patch), sustain pedals, and pitch bend/modulation controls, but not the nuanced touch of skin on a vibrating string.
I think synth guitars exist, too, but don’t know anything about them. The pedalboards are enough, maybe :)
> There is an unsolvable disconnect between the performer's actions and their audience
Is that really true though? If I watch a cellist play I can pretty clearly see all the things they are doing and it will correlate neatly to the timbre of the sound.
Secondly, I think it's important to note that the tube amp and the guitar are separable, and I don't think their connection is particularly magical. I can reamp a sound from my synthesizer (or maybe a keytar?) into a guitar chain, and if I manipulate the mic and other controls the same way I might manipulate the pickup, I can get all manner of interesting feedback effects. My inputs will have different harmonic characteristics, of course, and the tube amp's effects are mostly transformations of harmonics; you'll still get some cool tones, and they will be subject to a lot of the same rules as if a guitar were being played.
They're talking about electronic instruments there. The comment is about how electronic instruments don't generally match the physical expressiveness of acoustic instruments (like the Cello).
I'm talking about how electronic instruments are deficient in expressiveness compared to your cello example.
> Secondly I think it's important to note the tube amp and the guitar are separable, and I don't think that their connection is particularly magical. I can reamp a sound from my synthesizer (or maybe a keytar?) into a guitar chain, and if I manipulate the mic and other controls in the same way I might manipulate the pickup, I can also get all manner of interesting feedback effects.
The story is not quite so simple. Your synthesizer is going to have a buffered output, so it won't have the complex impedance-loading interactions with the amplifier that a guitar pickup has.
This is actually critical to how early distortion effects such as the classic Fuzz Face work, and imo is essential for the kind of complex timbres you can produce with a guitar + tube amp.
In fact you can take an electric guitar, put a buffer pedal in the chain between your fuzz pedal and amp and completely destroy the ability to produce wild feedback and distortion.
I'm a guitarist, but there's nothing particularly magical about a high-impedance signal, other than that it tends to lead to noise and makes really obnoxious things matter, like how low-capacitance your cable is. Also, a TON of modern guitars have low(ish)-impedance outputs because they use active pickups.
The pedals and the system being dependent on high impedance was always a bug, not a feature, and it makes the setup incredibly dependent on variables that really wouldn't be that hard to just buffer and then recreate deterministically. If your pedal should react to that impedance, just buffer the front and put a big inductor (or a transformer using only half, or, and I've actually seen this, just a whole guitar pickup) in the pedal. Then you're not dependent on the pickups of the guitar or the capacitance of the cable or anything else, and you can make sure the effect sounds good regardless of pickup type.
That is going to be something like a transformer to step down your line level signal and some series resistance to match the load to help drive the amp.
An actual coil pickup has reactive impedance that is frequency dependent and will result in a more complex interaction between the devices.
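To make that concrete, here's a rough numeric sketch of that frequency-dependent loading. The component values (about 6 kΩ DC resistance and 2.5 H inductance for a single-coil-style pickup, about 10 kΩ for a vintage-style fuzz input) are ballpark figures I'm assuming for illustration, and the model ignores winding capacitance and the volume pot:

```python
import math

def pickup_impedance(f_hz, r_ohm=6000.0, l_henry=2.5):
    # Simplified series R-L model of a passive pickup (no winding capacitance)
    xl = 2 * math.pi * f_hz * l_henry   # inductive reactance rises with frequency
    return math.sqrt(r_ohm ** 2 + xl ** 2)

def loaded_gain(f_hz, load_ohm):
    # Voltage divider formed by the pickup's source impedance and the
    # low input impedance of the next stage (e.g. a vintage-style fuzz)
    z = pickup_impedance(f_hz)
    return load_ohm / (z + load_ohm)

# A ~10k load attenuates treble far more than bass, because the pickup's
# source impedance grows with frequency:
bass_gain = loaded_gain(100, 10_000)      # ~0.6
treble_gain = loaded_gain(5_000, 10_000)  # ~0.11
```

A buffered source has a flat, near-zero output impedance, so this frequency-dependent divider simply disappears, which is one way to see why inserting a buffer changes the character of the fuzz so drastically.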
> The pedals and system being dependent on the high impedance was always a bug, not a feature
Sure if you think like an engineer, but everything you are complaining about is what allows someone like Jimi Hendrix to do what he did with a guitar.
they're comparing an electric guitar to electronic instruments, like midi keyboards. An electric cello would be the same thing as an electric guitar in this context.
No two trumpet players sound the same. I know who is playing just by the tone. Listen to Herb Alpert / Al Hirt / Maurice Andre, all playing the same instrument, but wildly different.
There have been some interesting keyboard input devices coming out which allow for more expression than normal piano keys, using a sort of hack to the MIDI system called MPE - MIDI Polyphonic Expression. For example the Seaboard Rise or the Osmose. Depending on the instrument it's possible to do per-note pitch bends, change pressure while holding notes, perform vibrato etc. Visually the physical movement is not as interesting as electric guitar though, so yours probably still wins.
Great argument -- but I'd also counter that "the turntable" (i.e. in the hands of experts like Q-Bert, Craze, Rob Swift, Jazzy Jeff and others) fits this quite well -- especially re your "have the audience understand what he is doing argument"
Haha, that is a great, highly expressive counterexample! However, as far as versatility of sound goes, I still think the guitar + tube amp wins, as you have access to all of Western music theory and technique since it's still a traditional string instrument.
Similar to the Theremin is the ondes Martenot. Jonny Greenwood (Radiohead) describes it as a "very accurate Theremin".
You can hear it particularly on "Where I End and You Begin" from Hail to the Thief. Ed O'Brien compliments its sound using an EBow (back before he had the sustainer) in that song.
> All other electronic instruments, with the one exception being the Theremin, have a fundamental problem with human expression. There is an unsolvable disconnect between the performer's actions and their audience.
Electric bass? Heck, even in synthesizers, you have the EWI or the Haken Continuum.
Guitar (and bass) are obviously and far and away the most successful, but it does a disservice to a number of wonderful inventions to say they're the only ones. Just look at what the Japanese band T-SQUARE does with the EWI to see people innovating at the edges.
This comment is a love letter to electric guitar. I adore it. Consider reading “Desolation Road” by Ian McDonald. I don’t want to spoil any of it, and perhaps science fiction isn’t your cup of tea, but at one point there is a character on Mars with a 700-year-old strat, and you can tell Ian McDonald loves the guitar as much as you do.
I feel like the synthesizer--CMI Fairlight, Moog anything, Synclavier, PPG Wave, and just the general concept of modular synthesis--is a pretty staunch competitor. Yours is certainly a fun and fair take, and arguably the electric guitar + tube amp birthed many genres (blues, soul, funk, rock, punk, metal, etc.), whereas synthesizers remained pretty niche, contributing to experimental and pop music, mixing in with rock, funk, and disco, and the titan of EDM that grew out of that.
You could argue that it's one of the most versatile instruments, sure. "Greatest" is completely subjective.
But is it one of the most versatile instruments? You can do signal transforms with any kind of audio input, although it's done more with the electric guitar than any other instruments.
I would say that, in practice, it has the most versatile sonic profile.
A modular synth is more versatile in terms of enumerated signal transformations. It's the ability to be expressive with those signal transformations that makes the guitar + tube amp what it is.
With the right interface, I think the synth can be more expressive. Look at the Haken Continuum or ExpressiveE Osmose - both can be used with something like the Expert Sleepers FH-2 to get MPE data to the modular.
I do see your point, and agree the amount of articulation you can do with guitar is hard to beat, but I do think a synth can win, if the setup is built for it.
Synths with mod wheels are the bomb. I used to have a Roland with a pitch wheel for bends that you could push for tremolos, vibratos and such, plus way more voices, envelopes, etc., and that was a few decades ago. I'm sure that nowadays guitars can't compete except at one thing: making guitar-sounding noises. You can get guitary sounds from a synth, but somehow they come off to me as too clean, lacking the slop that various fingerings produce lol
I watched Wayne Coyne of the Flaming Lips do something similar with some kind of "I don't know what" controller, it was some kind of input in his microphone stand. As he moved it around, the sound and projection changed.
I remembered learning about similar MIDI controllers when I was in school.
Imogen Heap created a set of gloves that transform finger flexing and wrist movement into midi signals you can use in whatever way your performance software allows.
I generally reserve the word electronic to mean something with a microcontroller or discrete logic components. Electronic guitars exist, but they're basically differently shaped keyboards.
I often lament the lack of other electric instruments.
I have come around to the idea of guitars being electronic instruments. Strings are the original oscillators. Once they become electrical signals it isn't clear to me how they differ categorically from any other electric instrument. There are an almost infinite number of pedals, many of which offer things like filters, LFOs, and other synthesis stalwarts. You could even make the guitar a controller for more traditional synthesis work.
>"All other electronic instruments, with the one exception being the Theremin, have a fundamental problem with human expression. There is an unsolvable disconnect between the performer's actions and what their audience perceives."
Look at the Roli Seaboard; it has an insane number of degrees of freedom for expression.
Synth music elevated electrically generated tones beyond anything ever heard.
I remind you that most rock and roll and rock music was about speed and mimicking the sound of a rumbling car engine, as it was a symbol of freedom in America: being able to run away from your toxic community to find yourself somewhere better.
That was the message for the young with rock and roll: a speedy engine for your ears.
Electronic music was like replacing the car with a UFO, evoking space travel.
With the progressive subgenre of techno music you got the same feeling, but with no subtle hints. Heck, one of the most famous songs in Spain ever, "Flying Free", literally remixes the sounds of drifting cars between the melodies, making the listener really happy in a very direct way, as tons of young people in the 90's got to the outskirt night clubs... by car. So it felt like driving an infinite highway rave with no end for days.
The amusing thing (to me at least) is that while the DX7 gave users almost infinite options as to how they could create and shape sounds, if you know what to listen for you'll hear the E PIANO 1 and BASS 1 presets in about half of all mid-80s hits. Turns out when they gave musicians a tool with immense flexibility, many of them still chose to use two of the (admittedly great) preset sounds.
Yup. FM Synthesis is challenging enough to implement, but doing so on the DX7's interface is a whole other level of frustrating. It's far from the hands-on interfaces of most subtractive or modular synthesizers.
Apparently this happens every time. A sample disk included with tracker software was used in hundreds if not thousands of modules and pretty much defined the sound of the Amiga.
The DX-7's FM synthesis opened the door to a pretty narrow but interesting range of sounds, bells and brass, which people loved, and it was a ripsnorting success for a time. But it didn't displace subtractive analog synths, and people aren't exactly playing FM synthesizers any more, while they are now heavily back into analog subtractive. Of course there are also romplers and samplers, etc., and those can achieve the sounds that FM did, but it's hard to call the DX-7 any kind of be-all end-all.
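As a toy illustration of why FM lands on those bell and brass timbres: one modulator shifts a carrier's instantaneous phase, and the modulation index controls how many sidebands (and thus how bright a tone) you get. A minimal two-operator sketch, nowhere near DX7-accurate (the real thing has six operators, 32 algorithms, and a per-operator envelope), just the core equation:

```python
import math

def fm_sample(t: float, fc: float = 440.0, fm: float = 880.0, index: float = 2.0) -> float:
    """Two-operator FM: a sine carrier phase-modulated by a sine modulator.

    A 1:2 carrier:modulator ratio gives a bell-like inharmonic-leaning
    spectrum; index = 0 reduces to a plain sine.
    """
    return math.sin(2 * math.pi * fc * t + index * math.sin(2 * math.pi * fm * t))

sr = 44_100
tone = [fm_sample(n / sr) for n in range(sr)]  # one second of audio, range [-1, 1]
```

On the original hardware you programmed this by stepping through dozens of menu pages with a data slider, which goes some way toward explaining the preset phenomenon above.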
> I don't think Hendrix was on a 'mission' to solve engineering puzzles at all. He was just experimenting, as an artist.
1,000,000%. Guitar is one of those hobbies where people mythologize and build elaborate hagiographies around players they like and the gear that they used. Hendrix was a generational talent but I highly doubt he was sitting around enumerating problem statements and systematically exploring solution spaces. The Fuzz Face was one of like four dirtboxes available during that time so he chose that one. He flipped a guitar upside down because he could source one more easily than a lefty model. He leveraged feedback because he discovered it naturally and realized that he could make it sound totally badass.
The man clearly had a vision and executed it but his decisions were pragmatic, not the product of grand technical reasoning. It reminds me of the student who wrote a bunch of authors and asked to what degree they were conscious of the themes and symbolism in their work [1]. Many were not - as it turns out English teachers often put the cart before the horse. This is the rock and roll version of that.
I can't knock the article though as it has a lot of sound (pun intended) analysis in it as opposed to typical guitar forum dreck about NOS tube and hand-wired turret board magic.
> He flipped a guitar upside down because he could source one more easily than a lefty model.
I've read that he claimed he played a right-handed guitar upside down because his father was superstitious and didn't like him doing things left-handed, so he'd play a right-handed guitar upside down most of the time and flip it over when he needed to play in front of his father. (I'm not sure why he didn't play a lefty guitar upside if that was the case, but I could imagine that the availability might be relevant like you mentioned, or maybe his father was familiar enough with guitars to be able to recognize a left-handed one and figure out what was going on, or maybe because he was better left-handed he could play it upside-down well enough but due to not being right-handed he would have found it more difficult to play it in the non-standard way).
A compressor pedal was the best thing I bought when trying to imitate Mark Knopfler doing Sultans of Swing. Not only does it sound good but a fender with a compressor is also much easier to play than an acoustic guitar.
Hi! I work at IEEE Spectrum and there's no way an LLM wrote this. We have a pretty strict Generative AI use policy (bottom of this page https://spectrum.ieee.org/about). I'm guessing this is from writers using actual writing techniques that Gen AI stole from...
I took my introductory college writing classes at a college I can’t name-drop without sounding like a jerk, which also did a bunch of LLM research over the years. We used a TON of em dashes in our writing. It’s no mystery, to me, where that stylistically prevalent quirk comes from. I’ve definitely been accused of being an LLM bot.
I was speaking with my 14 year old nephew via messaging last month. It was about a deep topic, synthetic consciousness. He wrote such an intelligent reply that I asked him: hey, was this from an LLM? He was insulted. I did research with his parents and found out that 90% no, he's just a very smart kid.
Is there a name for this mode of confusion yet?
Indeed. Along the same lines, the recent "humans parading as agents" on moltbook story made me think... what is the inverse of the goal of captcha? That's impossible practically, right?
He was insulted on two counts. Firstly doubting his intelligence. Secondly the insinuation of deception: "You aren't that smart, surely you cheated."
The insulting didn't end there. You asked his parents! Even then you only landed at 90%, yet another insult because why can't he earn 100%? Ethical dilemmas on all sides!
Yes, suspected LLM output is mimicry passing under the Turing test, cheapened by poor editing, further cheapened by even poorer training. It is going to get both worse and better at the same time. (I grew up with ELIZA.BAS and used to easily spot fakes, and it is getting both easier and harder.) I detest the words of LLM, but not the M. It's a statistical model. It is a very large language statistical model that is constantly fooling slower hoomins.
That policy doesn't explicitly disallow writers from using LLMs as part of their process, nor does it mention reviewing submissions for content that could be LLM-generated.
I like some of the ideas in the article but there are some very "it wasn't just A, it was B" sentences in there. IEEE has a higher standard.
It's because there's clearly a near-1:1 ratio of input to output. I also noticed some LLMisms, and I suspect the author may have run the text (perhaps in the form of a large number of bullet points) through an LLM. But because he's using the LLM to clean instead of multiply, it's still worth reading.
Probably similar to what I do with my papers and resumes, I write them myself then throw them through LLMs for suggestions and corrections, manually reviewing the output.
LLM-isms are tolerably bad. LLM's narrative ability is intolerably terrible. As others said, because a human actually wrote the overall narration for this, it was still compelling to read. The mistake would be skipping a well-narrated and thoughtful article just because of a few bad LLMisms.
I think LLM's lack of "theory of mind" leads to them severely underperforming on narration and humor.
Ironic because IEEE Spectrum has an anti-LLM policy. So your complex about LLM writing styles has indirectly caused you to stop supporting genuine prose.
Seriously, there's no LLM stuff in here. Only em dashes, which were used in journalism decades before AI was even a thing.
I feel for you, because moving forward more and more interesting and substantive articles will be written with LLM-isms, either because an LLM was used directly in the writing or because the authors absorbed the style.
The way this article was written is the standard way these kinds of US pop-science articles have always been written. It's the LLM that absorbed that, not the opposite.
The amazing thing is that Hendrix during live performances had the same wonderful effects as he got in the studio. I only saw Hendrix play live one time; that was in San Diego, a few weeks before he died in England.
I love how nonchalantly you threw this one in. I am proper jealous, how was it?
On your first remark, I agree. This is why I love Dire Straits and Mark Knopfler. The studio recordings are amazing, and then you listen to their live stuff and it's even better.
Interestingly, to me Hendrix and Knopfler feel kinda like creative opposites. Knopfler plays much cleaner and with much more variety, he has written many songs that are catchy in different ways (which is almost impossible for any musician), he basically achieves every goal that a beginner musician could choose to chase. And yet his stuff feels like a creative dead end, a dreary road leading to adult contemporary. While Hendrix has no songs to speak of and only one sloppy screechy sound, but it's the sound that launches a thousand bands and feels inspiring even now. Maybe the lesson we're supposed to learn is that we shouldn't choose what goal to chase, we should just feel it.
Hendrix seemed like a really nice guy. His vibe was that he loved everyone in the audience. I was close to the stage and took some nice pictures using Anscochrome 500 that I developed myself, pushing it to 1000 ASA. I treasure those pictures.
I like how the phone rings in the background on Gypsy Eyes. Wonder who called?
Voodoo Chile lyrics: "on the night I was born the moon turned fire red".
Poetic license? Stellarium reveals on the early evening of November 27, 1942 in Seattle, the moon was low on the horizon - just 25 degrees altitude at 5:30pm, directly East. The sun set at 5pm. While not a full moon it was 85%, so I'm calling it! The moon may have glowed a warm orange-red on the night (of the day) Hendrix was born.
I've often marveled at the success many guitar players had with experimental electronics - Hendrix, EVH, Les Paul, Brian May, Jack White, and Tom Scholz (special case, of course) are just a few examples.
The podcast "History of Rock in 500 Songs" (full disclosure: I am a devout, slavering fan) provides these on the regular. I was actually smiling when I heard a fairly new song that attempts a really flat, fuzzed out sound because it made me think, "Buddy Holly invented that by accident with a broken speaker". One of the episodes on The Who goes into the Marshall behind Marshall amps in similar detail.
I suppose if I were going to recommend a single episode to Hacker News though, it would be https://500songs.com/podcast/episode-146-good-vibrations-by-... which begins with at least a half hour on the amazing (if not happy) life of the guy who invented the Theremin, Lev Sergeyevich Termen.
It was even crazier than that, it was four Fender Twin Reverbs!
Considering the amount of watts involved in this, it probably qualifies as rocket science haha
JM: Yeah. The tremolo sound from the intro? That was four Fender Twin Reverbs. Myself controlling the speed of two of them and the producer controlling the speed of the other two. So two amps were recorded on one side of the stereo and the other two on the other side. I recorded the part on the tape without the tremolo, and then I sent the part from tape out to four amps, and he controlled two, and I controlled the other two.
And it took a long time, because inevitably the tremolo would go out of time with the track, since the tremolo doesn't stay in regular clock time. Also, we would drift out of time with each other's amps, so we had to keep looking up at each other after every fifteen-second burst and kind of fess up: "Oh yeah, mine kind of went out of time." It took a long time, but I'm glad we did it that way, because if we had cut and pasted two seconds of audio, it wouldn't have had the same dynamic quality throughout the six minutes of the song, or however long it is.
This is why I feel the recentish (last 10-15 years) shift in decoupling CS curricula from EE and CE fundamentals in the US is doing a massive disservice to newer students entering the industry.
DSP, Control Engineering, Circuit Design, understanding pipelining and caching, and other fundamentals are important for people to understand higher levels of the abstraction layers (eg. much of deep learning is built on top of Optimization Theory principles which are introduced in a DSP class).
The value of Computer Science isn't the ability to whiteboard a Leetcode hard question or glue together PyTorch commands - it's the ability to reason across multiple abstraction layers.
And newer grads are significantly deskilled due to these curriculum changes. If I as a VC know more about Nagle's Algorithm (hi Animats!) than some of the potential technical founders for network security or MLOps companies, we are in trouble.
I came into this from a CS and math background without CE or EE, and took two dedicated optimization courses (one happened to be in an EE department, but had no EE prereqs), as well as the optimization introduced in machine learning classes. To be honest, a lot of the older-school optimization is barely even useful; second-order methods are a bit passé for large-scale ML, largely because they don't work at that scale, not because people aren't aware of them (Adam and Muon can be seen as approximations to second-order methods, though, so it is useful to be aware of that structure).
Isn't Nagle usually introduced in a networking class typically taken by CS (non-CE/EE) undergrads?
Just because EEs are exposed to some mathematical concepts during their training doesn't mean that non-EEs are not exposed through a different path.
> Isn't Nagle usually introduced in a networking class typically taken by CS (non-CE/EE) undergrads
Networking, OS, and Distributed Systems are increasingly treated as CompE or even EE in the US nowadays.
> Just because EEs are exposed...
That's the thing - I truly do not believe that EE and CS should be decoupled, and I believe ECE as a stopgap is doing a disservice to the talent pipeline we need for my verticals to remain in the US, especially when comparing target American CS and EECS programs to peer CEE, Indian, and Israeli CS programs [0].
There is no reason that a CS major should not be required to take summary courses in circuits, DSP, computer architecture, and OS fundamentals, when this is the norm in most CS programs abroad. Additionally, I do not see any reason for EEs and ECEs not to take Algorithms, Data Structures, and Compilers as well.
> Just because EEs are exposed to some mathematical concepts during their training doesn't mean that non-EEs are not exposed through a different path
Mind you, I'm primarily in Cybersecurity, AI/ML infra, DefenseTech, and DeepTech adjacent spaces - basically, anything aligned with the "American Dynamism" or Cyberstarts thesis.
From what I've seen, the most successful founders are those who are able to adeptly reason and problem solve, but are also able to communicate to technical buyers because you are selling a technical product where those people make the decision.
Just because an approach isn't useful today doesn't necessarily imply it isn't in the future and being exposed to those kinds of knowledge and foundational principles makes it easier for one to evaluate and reason through problem spaces that are similar but not necessarily the same - for example, going to the Nagle's example - this was a bog standard networking concept that has now become critical in foundation model training because interconnect performance is a critical problem which can impact margins.
A lot of foundational knowledge is useful no matter what, and is why we fund founders and hire talent at competitive salaries.
Muon is much more sophisticated than Newton's method. Neural networks have started to borrow techniques from statistical mechanics, and various branches of maths like invariant theory that were previously rarely used in engineering. CS is not dumbing down; its needs and focus are changing.
I've never needed or benefited from most of the EE curriculum. There is an opportunity cost in learning things you don't need.
I guess it depends on where you went. I was a CS student at Virginia Tech in the late 90s. The CS department wasn't even in the engineering school. We did have to take computer architecture, which was the only course other than math/physics we had in common with EE/CE.
I know at MIT it was (and I think still is) one major - EECS, and students had substantial latitude on how much they wanted to concentrate into hardware or software at least after the intro courses.
I graduated in 2020, and I took a circuit design class and was taught Nagle's algorithm. I guess I could have learned more, but I thought the degree was packed enough when you consider all its different parts, from the math to systems programming to the ML stuff.
For a better example of the guitar as a synth-like feedback device, listen to Robert Fripp's solo on "Heroes" by Bowie. He used markings on the studio floor to achieve the desired tone.
No he wasn't. Sorry, but being a systems engineer will never be as cool as being a revolutionary musical genius. On the plus side you probably get to live 2-3 times longer.
That's stretching the term to the breaking point, for me. Is there some evidence of systematic analysis of component parts? Attempts to model elements of the problem? Data gathering and data analysis? Simulation? Intentional application of principles of physics or some other pure domain to a real-world problem?
Artistic endeavors come from lots of places, not just people with an analytical mindset. Historically those two are seen as opposing tendencies, which I think is unfair, but it points to the importance of intuition and navigating perception and emotion for artists.
Crazy example of when everything is AI generated, even the code referenced in git repo (refer to commit 3d733ca), and actually interesting and "new" in a way...
I’m curious because to tell you the truth the novelty struck me as similar to comparisons I’ve toyed with using LLMs on my own. The AI-generated logic between comparing two dissimilar things is too sterile for my liking.
I understand that this is appearing in a technical publication, but for some reason that invites even further scrutiny on my part.
Art and engineering are both constrained optimization problems - at their core, both involve transforming a loosely defined aesthetic desire into a repeatable methodology!
And if we can call ourselves software engineers, where our day-to-day (mostly) involves less calculus and more creative interpretation of loose ideas, in the context of a corpus of historical texts that we literally call "libraries" - are we not artists and art historians?
We're far closer to Jimi than Roger, in many ways. Pots and kettles :)
That's great, but it doesn't make you or any of us engineers.
Just because I drive my car with immense focus, make precision shifts, and hit the apex of all of my turns when getting onto and off of the freeway doesn't make me a race car driver.
Engineers don't just feel good vibes about science and mix it into their work. It is the core of their work.
Simply having a methodology absolutely is not sufficient for being an engineer.
And great, you have an arbitrary system of ethics, like everyone does I imagine. But no one holds you to these ethics.
This is a terrible article. In the first subplot, there is no explanation of what v(b1) and v(c2) are. The -8 on the y axis (amplitude) looks like an upside-down 8.
Further down there is a sentence: "First, the Fuzz Face is a two-transistor feedback amplifier that turns a gentle sinusoid signal into an almost binary “fuzzy” output." But the figure does not match this - there is no "gentle sinusoid" wave shown on the first fuzz face plot.
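For what it's worth, the transformation that sentence describes is easy to reproduce numerically: push a sine through enough gain and a hard limiter and it squares off, which is where the odd-harmonic "fuzz" buzz comes from. A crude stand-in for the Fuzz Face's clipping (the real circuit's asymmetric germanium transfer curve is more interesting than this):

```python
import math

def fuzz(x: float, gain: float = 50.0, rail: float = 1.0) -> float:
    """Hard-clip a heavily amplified signal to the supply 'rails'."""
    return max(-rail, min(rail, gain * x))

sine = [math.sin(2 * math.pi * n / 64) for n in range(64)]  # one gentle cycle
fuzzed = [fuzz(s) for s in sine]
# With gain=50, almost every sample slams into +/-1: a near-square wave.
```

The "almost binary" wording in the article is accurate for exactly this reason; only the zero crossings survive unclipped.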
Hendrix reportedly discovered feedback by walking away from a cranked amp. The guitar just kept sustaining on its own. What followed was years of empirical system identification: learning how body position, pickup selection, and guitar-to-amp distance affected feedback character. No transfer function, just iteration. That's a valid engineering methodology.
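In control terms, what he was tuning by ear is loop gain: string to pickup to amp to speaker to air and guitar body, back to string. If the round-trip gain at some frequency exceeds 1, that mode grows until the amp saturates; moving closer to or farther from the amp moves the gain across that threshold. A toy model of the runaway-then-limit behavior (the numbers are illustrative, not a model of any real rig):

```python
import math

def feedback_envelope(loop_gain: float, steps: int = 40, start: float = 1e-3) -> list[float]:
    """Amplitude of one resonant mode per round trip through the loop.

    Growth is exponential while the signal is small; tanh stands in for
    amplifier saturation, which caps the runaway at a steady 'singing' level.
    """
    level, out = start, []
    for _ in range(steps):
        level = math.tanh(loop_gain * level)  # amplify, then saturate
        out.append(level)
    return out

sustained = feedback_envelope(1.5)  # gain > 1: grows, then locks near a limit
dying = feedback_envelope(0.7)      # gain < 1: the note decays away
```

That bounded limit cycle is why feedback can "sing" a stable note rather than just screech without limit.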
Any guitarist in a 1940s big band would have a big hollowbody guitar and an amp. That combination is incredibly prone to feedback. Everyone worked to reduce feedback and avoid it. That's what I do with my hollowbody when I play with a big band. It's the first thing that happens when you turn up.
Hendrix did not "discover" feedback, and in fact he did not discover the musical uses of feedback - you can hear it in BB King records that predate Hendrix, where feedback makes his notes "sing."
What Hendrix did was turn feedback into an intentional musical creation that he treated as a melodic voice.
In a sluggish economy
Inflation, recession
Hits the land of the free
Standing in unemployment lines
Blame the government for hard time
We just get by
However we can
We all gotta duck
When the shit hits the fan
Slightly off-topic--
Before my time, but my professor* recalled to our class his experience watching a _live_ news report from Vietnam. Something shocking happened during the broadcast. As a visual-media scholar he contacted the station to obtain a copy. No go. He remarked how he never saw that footage ever again (at that time it would have been over 15 years ago). In our modern digital age it's difficult to imagine anything going live to the nation, and then disappearing.
* (Charles Chess, Introduction to Film, SJSU, c1992)
The Epstein files would like a chat with you.
As would "flood the zone".
There was one scene where Richard Burton and Elizabeth Taylor were arguing with each other. I watched their lips move, and somehow I heard Burton speaking his lines in his voice, and Taylor her lines in her voice. I had to do a double take to see that the sound was actually muted, but my mind re-created it anyway.
BrainOS 1.1> Optimize Memory (Y/N) __
I can tell when a musician is lip syncing their hit song, because nobody sings a song the same way twice, and the performance exactly matched the CD version of the song.
Instead, I can recall the complete works of Roger Waters or Nine Inch Nails, but not the names of the songs unless I really studied that part. I can recall themes from TV shows from decades ago, but be unable to place the name of the show.
At any given time, anywhere at all, I can listen to any of at least five different covers of Fat Bottomed Girls -- and have no idea who performed any of them, and therefore no ability to share them with others.
It's an interesting way to be and it is the only way I know, but there's reasons that I'm terrible at being a DJ.
Like a jealous plumber, worried that Kim Kardashian's "Break the Internet" photo series will take away from his appeal, hurriedly posting photos of his plumber's crack online...
I had heard it a lot in punk and pop-punk to create swells. I improvised my still-favorite solo that day.
The discovery of feedback tones and their incorporation into the musical experience: a three-hour-warm bank of tubes turned up to the limit, with a maxed-out savant unlocking new realms of sound.
One thing I noticed is that his playing does not require a rhythm guitarist. What worked so well, I realized, was Mitch Mitchell: as a jazz drummer, his playing was heavily influenced by the classics, and in a way it complemented Jimi's guitar tone perfectly.
That would have blown the doors off of everything.
I don't think there was another as "out there" guitar player as Jimi until EVH came along - a little more controlled, but just as confident and chaotic. EVH was quite the systems engineer himself (variac, Floyd Rose later on etc)
Miles always impressed me with his ability to pick the best to back him up, and /then/ let them take the front. Some tracks he barely plays on, waiting minutes for his entry.
Jimi wanted the best to back him up. But I agree with you; I'm just pointing out why I think he didn't.
Yeah that's a good insight.
Also maybe not until the night of his first big gig there.
Townshend had Marshall build 100-watters so he could play louder while staying clean. Clapton had already been cranking it with a Gibson SG, which is a characteristic sound all its own; he was in the audience at the gig and was blown away watching Hendrix.
Every year from at least 1964 to 1984, more advanced amps were made than ever existed before.
Hendrix’s girlfriend Kathy Etchingham claims he never abused her. Some third parties dispute her claims about her relationship.
His arrest record suggests at least some type of altercation with a previous girlfriend but it’s far from clear cut to me.
People are complex and reality is complex. I myself was subject to false accusations about abuse from a disgruntled ex girlfriend (who actually WAS in fact physically and mentally abusive to me and I have the scars to prove it).
But regardless, I have zero issues reflecting on a person’s accomplishments and talents even in the context of them being a horrible person. In fact, I find that part of the intrigue of really talented people. Reality and people are quite multi-dimensional. The only general rule I know is that nobody is perfect and holding up ANYone as some example of moral perfection is almost certainly wrong.
OTOH: why does his accidental fatal pathway tarnish him morally to you?
Very mixed message: "Don't beat women nor vomit!"
One thing I learned after buying some gear at home to try to record electric guitar at low volume is how important the physics of the speakers are. You can plug a tube amp into a cabinet simulator and you'll lose a lot more than using solid state electronics on a good but not great Fender amp, especially if you use fuzz / distortion pedals.
I'm not sure Hendrix was a systems engineer, but he was a transcendent blues artist, that's for sure.
Since the 1980s, we have had the "Sustainiac": an active circuit installed in the electric guitar along with a "reverse pickup" which is energized in order to excite vibration in the strings.
With this device, at the flip of a switch, you get indefinite sustain on any note on the neck, at any volume, distortion or not --- even if the electric guitar is not plugged into an amplifier at all, and just heard acoustically.
The best implementations of this have a three-way harmonic switch. You can choose between exciting the fretted (or open) note itself (the fundamental, a.k.a. the first harmonic), an octave above it (the second harmonic), or a higher harmonic still.
You can be sustaining the given note, and then, at the flip of a switch, it will fade over to the higher harmonic.
YouTube videos of this in action are worth checking out.
Here is one:
https://www.youtube.com/watch?v=LZwPPGsxY6g
Nigel Tufnel: The sustain, listen to it. Marty DiBergi: I don't hear anything. Nigel Tufnel: Well you would though, if it were playing.
I've also managed to make an E-Bow work with a steel-string acoustic guitar (but only on one string IIRC).
https://au.fender.com/products/fender-eob-sustainer-stratoca...
You might enjoy this video. He really goes deep into using the guitar to create textures and emotions. He talks about the Edge (U2) and his Infinite Guitar, and how he actually called Michael Brook to see if he could get one. Eventually Fender did a custom build on his Clapton Strat, which became the Fender EOB Sustainer.
https://www.youtube.com/watch?v=YK4Fmrlqz3I
All other electronic instruments, with the one exception of the Theremin, have a fundamental problem with human expression. There is an unsolvable disconnect between the performer's actions and what their audience perceives.
See: https://www.scribd.com/document/55134776/48787070-Bob-Ostert...
With an electric guitar you get the physicality and dynamism of an acoustic instrument with the complex timbres and extended technique possibilities of an electric/electronic instrument.
There are complex and musically significant feedback loops occurring across many dimensions that lead to extremely complex transformations of timbre via both traditional music theoretical techniques and the physics of a tube amplifier combined with an inductive load (the guitar pickup).
It's really crazy how much more dynamic and complex this can be than even a highly sophisticated modular synthesizer or whatever. Even the way you overload the power supply in a tube amplifier can be manipulated on the fly to enhance and transform timbre.
Then on top of all that it is so incredibly physical that a performer like Jimi Hendrix can manipulate these systems and have the audience intuitively understand what he is doing. Never in a million years would THAT be possible with any other electronic instrument.
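The runaway feedback loop discussed in this thread can be sketched as a toy numerical model: a delayed recursion (string to pickup to amp to speaker back to string) with soft saturation standing in for the amp. All values here are invented for illustration, not a model of any real rig.

```python
import math

def feedback_loop(n_samples=4000, delay=100, loop_gain=1.05, drive_len=200):
    """Toy amp-speaker-string feedback: delayed recursion with soft saturation."""
    y = [0.0] * n_samples
    for n in range(n_samples):
        # a short burst of "playing", after which the player stops driving the string
        x = 0.1 * math.sin(2 * math.pi * n / delay) if n < drive_len else 0.0
        # the acoustic path feeds a delayed copy of the output back in
        fb = loop_gain * y[n - delay] if n >= delay else 0.0
        y[n] = math.tanh(x + fb)  # tanh stands in for amp saturation

    return y

sustained = feedback_loop(loop_gain=1.05)  # loop gain > 1: the note keeps ringing
decayed = feedback_loop(loop_gain=0.90)    # loop gain < 1: the note dies away
```

With loop gain just above 1, the recirculated signal grows until the saturation pins it, so the note rings on long after the player stops driving it; with loop gain below 1 it simply dies away, which is the everyday low-volume case.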
There are always some people who get extremely defensive whenever I say that techno didn't click for me until I heard this kind of "techlow" music. Specifically about the part where I think that the reason is also a human expression problem, because of limitations imposed by the electronic media used.
EDIT: having said that, I don't think I would agree with your premise, because it is colored by a subtle form of survivor bias. None of us remember what it's like to not know electric guitars or what they sound like, so claiming "the audience intuitively understands what Jimi Hendrix is doing" is like saying everyone "intuitively understands" their native language. On top of that, there's nothing about the workings of an electric guitar that wouldn't in principle work for something like an electric violin or whatever.
[0] https://www.youtube.com/watch?v=-0gED3rn2Tc
[1] https://www.youtube.com/watch?v=Mn52b-bWfFM
[2] https://www.youtube.com/watch?v=NYtjttnp1Rs
They do a great job with changing their timbre and tones but often ignore a bunch of other factors that make music interesting. Whether that is the rarity of time signatures other than 4/4, the way certain rhythms are locked into certain genres, the choices of keys used, the limited or missing chords, etc. At some point you start hearing two electronic songs that sound totally different at a superficial level and you realize they're incredibly derivative of each other.
[1] https://www.youtube.com/watch?v=pfXaL9omQPs
I guess the part people don't like hearing is the implication techno is somehow not expressive. I'm not sure that it lacks expressiveness, but it is certainly more "controlled" than traditional music. When I first heard techno as a teenager in the 90s, my mind was blown. I remember exactly where I was the first time I heard Underworld [1], Photek [2], and Autechre [3]. I think I was attracted to these sounds _because_ they were so different. I think it's hard for electronic music fans like myself to accept the idea that it isn't expressive _because_ it is so different. Isn't it just a different kind of expression?
Still, people like what they like. I'm glad you found a version of dance music that works for you. I've long since moved on from being judgmental about people's musical tastes. I think it's just wonderful that music exists at all!
[1] https://www.youtube.com/watch?v=Q5GjVvlmg3o [2] https://www.youtube.com/watch?v=-Xl1xzSRaV0 [3] https://www.youtube.com/watch?v=g6zT3kVtpHc
I think of it more like a painter's palette: every instrument and tool involved in creating music has a different set of colors to choose from, and can also filter some "colors" out if we think of things like audio processing filters.
The tools and techniques typically used to produce techno filter out "colors" that feel essential to me to connect with a song, and yeah, that "controlled" aspect of it is probably a large part of that. That doesn't mean it's not expressive, it's just expressive in a way that I struggle to connect with.
EDIT: funny enough I actually have protanomaly, so my choice of analogy is slightly ironic there. Some visual art and design out there objectively looks terrible from my subjective experience, since the colors look completely off. But that doesn't mean I'm saying the art is objectively bad.
Also, I haven't checked what Juno Reactor do these days, but their old work is fantastic. My favorite show of theirs is Juno Reactor – Shango Tour 2001 Tokyo [2].
For electric violin, I love Ed Alleyne-Johnson [3]. Never seen him live (I'm not from the UK) but I own a couple of his earlier works. It reminds me of the time when my dad was in the final years of his life, and when he finally passed away. Makes me cry every time.
[1] https://www.youtube.com/watch?v=VMIL1YbUQrI
[2] https://www.discogs.com/master/782091-Juno-Reactor-Shango-To...
[3] https://en.wikipedia.org/wiki/Ed_Alleyne-Johnson
just to be clear, Moog synthesizers (and a number of other brands) are electronic yes, but they are analog electronics.
https://www.youtube.com/watch?v=bixtQAq2LzE
And just to clarify: I don't dislike electronic instruments. I just think that on some subconscious level the human brain can detect other humans playing a live instrument. Like there's something "embodied" in the sound that is likely missing from a pure electronic instrument. And I needed that element to "unlock" access to techno.
Tons of work has been done on various modes of humanization by trying to parameterize and modulate these aspects over time. Timing accuracy, velocity variance, chance, etc.
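A deliberately minimal sketch of two of those parameters, timing accuracy and velocity variance, applied to hypothetical (onset_seconds, velocity) note events; the jitter ranges below are arbitrary choices, not values from any particular tool:

```python
import random

def humanize(notes, timing_jitter=0.01, vel_var=12, seed=None):
    """Jitter note onsets (in seconds) and MIDI velocities within given bounds."""
    rng = random.Random(seed)
    out = []
    for onset, velocity in notes:
        # nudge timing off the grid, and vary velocity while keeping it in 1..127
        t = onset + rng.uniform(-timing_jitter, timing_jitter)
        v = max(1, min(127, velocity + rng.randint(-vel_var, vel_var)))
        out.append((t, v))
    return out

grid = [(i * 0.5, 100) for i in range(8)]  # robotic: perfect grid, fixed velocity
played = humanize(grid, seed=42)           # the same line, slightly "human"
```

Real humanization engines go much further (modulating these parameters over time, per-genre groove templates, probability per note), but the shape of the idea is just this.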
A well-played instrument certainly feels like someone speaking and expressing themselves to you. There are attempts to capture this with MPE instruments such as the Osmose, or Imogen Heap's MiMU gloves.
https://www.expressivee.com/2-osmose
https://mimugloves.com/
How up to date is this opinion of yours? Expression on guitar is pretty intuitive, but modern electronic instrument manufacturers have been working on this problem and created modes of expression that definitely solve this problem.
For example, EWIs allow you to use breath control for expression with many of the same techniques available on actual wind instruments. Also many synths now have features like polyphonic aftertouch, pitch/mod wheels, which allow you to add expression to a note while it is playing. Apps and hardware exist which allow you to use novel methods of capturing motion or other forms of expression. And most modern synths/midi controllers allow you to decide what parameters are affected.
> Then on top of all that it is so incredibly physical
That's an affectation. I can stand on my tiptoes and close my eyes when bending up a note on the synth the same as I can on the guitar. Neither affects the sound, and both are a conscious decision to project an appearance of "I'm really shredding"
> With an electric guitar you get the physicality and dynamism of an acoustic instrument with the complex timbres and extended technique possibilities of an electric/electronic instrument.
That can apply to any instrument once you "electrify" it. What makes a guitar more expressive than a cello or trumpet with a pickup/mic running through effect processing? I play guitar, keys and trumpet, and while I agree that a casio keyboard has limited expression options, your opinion doesn't sound researched.
A whammy bar?
I certainly don’t agree with this as a musician who has tried most of these attempts by electronic music manufacturers.
For one, you can’t easily play two melodies simultaneously across several octaves, using both of your hands, with an electric guitar.
Stringed electronic instruments do have their advantages, but so do the others. Each music making thing has its place in the spectrum.
Two books that have helped me greatly in my musical life, in case people haven’t heard of them, are The Listening Book, and Bridge of Waves, by W.A. Mathieu.
If you're limiting yourself to a 6-string guitar, the distance between the two melodies would be limited compared to a piano, but guitars don't have to be limited to 6 strings.
Classical guitar is full of this kind of thing.
Having taken piano lessons but being more into guitar, I think the thing is that almost all people who play piano are introduced to this, and it is a core concept in far more piano music than guitar music. But it is not impossible on guitar, and many works for piano that get adapted to guitar require the player to do so.
E.g., there are plenty of players who have studied and played the Well-Tempered Clavier on guitar.
Keyboards can approach that with polyphonic touch keys like the Hydrasynth (lean into keys, pressing them harder, for bending the tone in a configured patch), sustain pedals, and pitch bend/modulation controls, but not the nuanced touch of skin on a vibrating string.
I think synth guitars exist, too, but don’t know anything about them. The pedalboards are enough, maybe :)
Of course they exist; just listen to Pat Metheny. There are MIDI hex pickups that can play any synth with MIDI and full expression.
Is that really true though? If I watch a cellist play I can pretty clearly see all the things they are doing and it will correlate neatly to the timbre of the sound.
Secondly, I think it's important to note that the tube amp and the guitar are separable, and I don't think that their connection is particularly magical. I can reamp a sound from my synthesizer (or maybe a keytar?) into a guitar chain, and if I manipulate the mic and other controls in the same way I might manipulate the pickup, I can also get all manner of interesting feedback effects. My inputs will have different harmonic characteristics of course, and the tube amp's effects are mostly transformations of harmonics; you'll still get some cool tones and they will be subject to a lot of the same rules as if a guitar were being played.
> Secondly, I think it's important to note that the tube amp and the guitar are separable, and I don't think that their connection is particularly magical. I can reamp a sound from my synthesizer (or maybe a keytar?) into a guitar chain, and if I manipulate the mic and other controls in the same way I might manipulate the pickup, I can also get all manner of interesting feedback effects.
The story is not quite so simple. Your synthesizer is going to have a buffered output, so it won't have the complex impedance-loading interactions with the amplifier that the guitar pickup does.
This is actually critical to how early distortion effects such as the classic Fuzzface work and imo is essential for the kind of complex timbres you can produce with a guitar + tube amp.
In fact you can take an electric guitar, put a buffer pedal in the chain between your fuzz pedal and amp and completely destroy the ability to produce wild feedback and distortion.
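A rough numerical sketch of that loading interaction: model the pickup as a series coil resistance and inductance feeding the cable capacitance in parallel with the input resistance of the next stage. The component values below are generic ballpark figures for a passive pickup and a guitar cable, not measurements of any specific gear:

```python
import math

def pickup_response(freq_hz, r_load, coil_l=2.5, coil_r=6e3, cable_c=470e-12):
    """|Vout/Vemf| of an R-L pickup driving cable capacitance || load resistance."""
    w = 2 * math.pi * freq_hz
    z_series = complex(coil_r, w * coil_l)        # coil resistance + inductance
    z_cap = complex(0, -1 / (w * cable_c))        # cable capacitance to ground
    z_load = (z_cap * r_load) / (z_cap + r_load)  # capacitance parallel with input
    return abs(z_load / (z_series + z_load))      # simple voltage divider

# High-impedance tube-amp input vs. a low-impedance fuzz input, near the
# pickup/cable L-C resonance (~4.6 kHz for these values)
into_amp = pickup_response(4500, r_load=1e6)
into_fuzz = pickup_response(4500, r_load=10e3)
```

With the ~1 MΩ load the L-C resonance produces a pronounced treble peak; the ~10 kΩ fuzz input damps that peak and rolls off the top end. This is the sense in which the pickup and a Fuzz Face behave as one coupled system rather than two independent stages, and why inserting a buffer between them changes the sound.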
I'm a guitarist, but there's nothing particularly magical about a high-impedance signal, other than that it tends to lead to noise and makes really obnoxious things matter, like how low-capacitance your cable is. Also, a TON of modern guitars have low(ish) impedance output because they use active pickups.
The pedals and system being dependent on the high impedance was always a bug, not a feature, and it makes the setup incredibly dependent on variables that really wouldn't be that hard to just buffer and then recreate deterministically. Like, if your pedal should react to that impedance, just buffer the front and put a big inductor (or a transformer using only half, or, - and I've actually seen this - just a whole guitar pickup) in the pedal. Then you're not dependent on the pickups of the guitar or the capacitance of the cable or anything else, and you can make sure the effect sounds good regardless of pickup type.
That is going to be something like a transformer to step down your line level signal and some series resistance to match the load to help drive the amp.
An actual coil pickup has reactive impedance that is frequency dependent and will result in a more complex interaction between the devices.
> The pedals and system being dependent on the high impedance was always a bug, not a feature
Sure if you think like an engineer, but everything you are complaining about is what allows someone like Jimi Hendrix to do what he did with a guitar.
You can hear it particularly on "Where I End and You Begin" from Hail to the Thief. Ed O'Brien complements its sound using an EBow (back before he had the sustainer) in that song.
https://en.wikipedia.org/wiki/Ondes_Martenot
Electric bass? Heck, even in synthesizers, you have the EWI or the Haken Continuum.
Guitar (and bass) are obviously far and away the most successful, but it does a disservice to a number of wonderful inventions to say they're the only ones. Just look at what the Japanese band T-SQUARE does with the EWI to see people innovating at the edges.
But is it one of the most versatile instruments? You can do signal transforms with any kind of audio input, although it's done more with the electric guitar than any other instruments.
I would say that, in practice, it has the most versatile sonic profile.
With the right interface, I think the synth can be more expressive. Look at the Haken Continuum or ExpressiveE Osmose - both can be used with something like the Expert Sleepers FH-2 to get MPE data to the modular.
I do see your point, and agree the amount of articulation you can do with guitar is hard to beat, but I do think a synth can win, if the setup is built for it.
I remembered learning about similar MIDI controllers when I was in school.
https://mimugloves.com/gloves/
https://www.youtube.com/watch?v=vq52kT6YY-0
I often lament the lack of other electric instruments.
Look at the Roli Seaboard; it has an insane number of degrees of freedom / expression.
https://youtu.be/2fQbtp2BgY4?si=S52A-22A3GlXPajU
the solo starts past the middle
Synth music elevated electronically generated tones beyond anything ever heard.
I remind you that most rock and roll and rock music was about speed and mimicking the sound of a rumbling car engine, as it was a symbol of freedom in America: being able to run away from your toxic community to find something better anywhere else.
That was the message for the young with rock and roll: a speedy engine for your ears.
Electronic music was like replacing the car with a UFO, evoking space travel.
With the progressive subgenre of techno music you got the same feeling, but with no subtle hints. Heck, one of the best-known songs in Spain ever, "Flying Free", literally remixes the sounds of drifting cars between the melodies, making the listener really happy in a very direct way, as tons of young people in the '90s got to the outskirt night clubs... by car. So they felt as if they were driving an infinite highway rave with no end for days.
https://youtu.be/roBkg-iPrbw
Few program synthesizers. Most just use presets. Infinite freedom is paralyzing. Building blocks are comfortable.
Jimi was left handed
>Electromagnetic pickups—(...)—fixed the loudness problem. But they left a new one: the envelope
Was it really a problem to be solved? Good tube amplifiers already existed back then. Clean guitar tone was not something frowned upon.
>Hendrix’s mission was (...)
>His solution was (...)
I don't think Hendrix was on a 'mission' to solve engineering puzzles at all. He was just experimenting, as an artist.
1,000,000%. Guitar is one of those hobbies where people mythologize and build elaborate hagiographies around players they like and the gear that they used. Hendrix was a generational talent but I highly doubt he was sitting around enumerating problem statements and systematically exploring solution spaces. The Fuzz Face was one of like four dirtboxes available during that time so he chose that one. He flipped a guitar upside down because he could source one more easily than a lefty model. He leveraged feedback because he discovered it naturally and realized that he could make it sound totally badass.
The man clearly had a vision and executed it but his decisions were pragmatic, not the product of grand technical reasoning. It reminds me of the student who wrote a bunch of authors and asked to what degree they were conscious of the themes and symbolism in their work [1]. Many were not - as it turns out English teachers often put the cart before the horse. This is the rock and roll version of that.
I can't knock the article though as it has a lot of sound (pun intended) analysis in it as opposed to typical guitar forum dreck about NOS tube and hand-wired turret board magic.
[1] https://www.theparisreview.org/blog/2011/12/05/document-the-...
I've read that he claimed he played a right-handed guitar upside down because his father was superstitious and didn't like him doing things left-handed, so he'd play a right-handed guitar upside down most of the time and flip it over when he needed to play in front of his father. (I'm not sure why he didn't play a lefty guitar upside down if that was the case, but I could imagine that availability might be relevant, like you mentioned; or maybe his father was familiar enough with guitars to recognize a left-handed one and figure out what was going on; or maybe, since he was better left-handed, he could play a righty guitar upside down well enough, while not being right-handed would have made it harder to play a lefty guitar the other way around.)
I was speaking with my 14 year old nephew via messaging last month. It was about a deep topic, synthetic consciousness. He wrote such an intelligent reply that I asked him: hey, was this from an LLM? He was insulted. I did research with his parents and found out that 90% no, he's just a very smart kid.
Is there a name for this mode of confusion yet?
The insulting didn't end there. You asked his parents! Even then you only landed at 90%, yet another insult because why can't he earn 100%? Ethical dilemmas on all sides!
I like some of the ideas in the article but there are some very "it wasn't just A, it was B" sentences in there. IEEE has a higher standard.
I think LLM's lack of "theory of mind" leads to them severely underperforming on narration and humor.
Seriously, there's no LLM stuff in here. Only em dashes, which were used in journalism decades before AI was even a thing.
I love how nonchalantly you threw this one in. I am proper jealous, how was it?
On your first remark, I agree. This is why I love Dire Straits and Mark Knopfler. The studio recordings are amazing, and then you listen to their live stuff and it's even better.
https://jimihendrixrecordguide.com/home-recordings/
(edit: syntax)
Voodoo Chile lyrics: "on the night I was born the moon turned fire red".
Poetic license? Stellarium reveals on the early evening of November 27, 1942 in Seattle, the moon was low on the horizon - just 25 degrees altitude at 5:30pm, directly East. The sun set at 5pm. While not a full moon it was 85%, so I'm calling it! The moon may have glowed a warm orange-red on the night (of the day) Hendrix was born.
I suppose if I were going to recommend a single episode to Hacker News though, it would be https://500songs.com/podcast/episode-146-good-vibrations-by-... which begins with at least a half hour on the amazing (if not happy) life of the guy who invented the Theremin, Lev Sergeyevich Termen.
What’s the newer song you mention with the flat fuzz?
The guy built his own guitar as a teenager and has played it for the rest of his career: https://en.wikipedia.org/wiki/Red_Special
https://www.youtube.com/watch?v=LHMv0ORn0hg
While some try to make it an exact science, it is not; there are things you still cannot put a number on, and it works ...
Considering the amount of watts involved in this, it probably qualifies as rocket science haha
JM: Yeah. The tremolo sound from the intro? That was four Fender Twin Reverbs. Myself controlling the speed of two of them and the producer controlling the speed of the other two. So two amps were recorded on one side of the stereo and the other two on the other side. I recorded the part on the tape without the tremolo, and then I sent the part from tape out to four amps, and he controlled two, and I controlled the other two.
And it took a long time because inevitably the tremolo would go out of time with the track, because the tremolo doesn't stay in regular clock time. Also we would go out with each other's amps, so we had to keep looking up at each other after every fifteen-second burst and kind of fess up, "Oh yeah, mine kind of went out of time." It took a long time, but I'm glad we did it that way because if we had cut and pasted two seconds of audio, it wouldn't have had the same dynamic quality throughout the six minutes of the song, or however long it is.
https://www.westword.com/music/johnny-marr-on-how-he-created...
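A back-of-the-envelope calculation shows why free-running tremolos drift apart so quickly: the relative phase slips at a rate equal to the difference between the two LFO frequencies. The rates below are made up for illustration, not Marr's actual settings:

```python
def half_cycle_drift_time(rate_a_hz, rate_b_hz):
    """Seconds until two free-running LFOs are half a cycle out of phase."""
    # phase slips at |f_a - f_b| cycles per second; half a cycle is full opposition
    return 0.5 / abs(rate_a_hz - rate_b_hz)

# Two "identical" tremolos mismatched by just 0.1 Hz reach full phase
# opposition in about 5 seconds
drift = half_cycle_drift_time(4.0, 4.1)
```

Even a tiny rate mismatch puts the effect audibly out of phase within seconds, which is consistent with the fifteen-second bursts described above.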
DSP, Control Engineering, Circuit Design, understanding pipelining and caching, and other fundamentals are important for people to understand higher levels of the abstraction layers (eg. much of deep learning is built on top of Optimization Theory principles which are introduced in a DSP class).
The value of Computer Science isn't the ability to whiteboard a Leetcode hard question or glue together PyTorch commands - it's the ability to reason across multiple abstraction layers.
And newer grads are significantly deskilled due to these curriculum changes. If I as a VC know more about Nagle's Algorithm (hi Animats!) than some of the potential technical founders for network security or MLOps companies, we are in trouble.
Isn't Nagle usually introduced in a networking class typically taken by CS (non-CE/EE) undergrads?
Just because EEs are exposed to some mathematical concepts during their training doesn't mean that non-EEs are not exposed through a different path.
Networking, OS, and Distributed Systems are increasingly treated as CompE or even EE nowadays in the US.
> Just because EEs are exposed...
That's the thing - I truly do not believe that EE and CS should be decoupled, and I believe ECE as a stopgap is doing a disservice to the talent pipeline we need for my verticals to remain in the US, especially when comparing target American CS and EECS programs to peer CEE, Indian, and Israeli CS programs [0].
There is no reason that a CS major should not be required to take summary courses in circuits, DSP, computer architecture, and OS fundamentals when this is the norm in most CS programs abroad. Additionally, I do not see any reason for EEs and ECEs not to take Algorithms, Data Structures, and Compilers as well.
> Just because EEs are exposed to some mathematical concepts during their training doesn't mean that non-EEs are not exposed through a different path
Mind you, I'm primarily in Cybersecurity, AI/ML infra, DefenseTech, and DeepTech adjacent spaces - basically, anything aligned with the "American Dynamism" or Cyberstarts thesis.
From what I've seen, the most successful founders are those who are able to adeptly reason and problem solve, but are also able to communicate to technical buyers because you are selling a technical product where those people make the decision.
Just because an approach isn't useful today doesn't necessarily imply it won't be in the future, and being exposed to that kind of knowledge and those foundational principles makes it easier to evaluate and reason through problem spaces that are similar but not necessarily the same. Going back to the Nagle example: this was a bog-standard networking concept that has now become critical in foundation model training, because interconnect performance is a critical problem which can impact margins.
A lot of foundational knowledge is useful no matter what, and is why we fund founders and hire talent at competitive salaries.
[0] - https://news.ycombinator.com/item?id=45413516
I've never needed or benefited from most of the EE curriculum. There is an opportunity cost in learning things you don't need.
I know at MIT it was (and I think still is) one major - EECS, and students had substantial latitude on how much they wanted to concentrate into hardware or software at least after the intro courses.
Artistic endeavors come from lots of places, not just people with an analytical mindset. Historically those two are seen as opposing tendencies, which I think is unfair, but it points to the importance of intuition and navigating perception and emotion for artists.
Music theory, Nashville notation
> attempts to model elements of the problem?
Ditto
> data gathering and data analysis?
Listening to a wide variety of music and understanding what makes a genre a genre
> simulation?
Cover songs, writing to a style
> Intentional application of principles of physics or some other pure domain to a real world problem?
Literally sound engineering
I’m curious because to tell you the truth the novelty struck me as similar to comparisons I’ve toyed with using LLMs on my own. The AI-generated logic between comparing two dissimilar things is too sterile for my liking.
I understand that this is appearing in a technical publication, but for some reason that invites even further scrutiny on my part.
Please share more reasons behind your suspicions.
> and the component was the Octavia guitar pedal, created for Hendrix by sound engineer Roger Mayer.
So, Roger was the engineer. And, Jimi was the artist.
And if we can call ourselves software engineers, where our day-to-day (mostly) involves less calculus and more creative interpretation of loose ideas, in the context of a corpus of historical texts that we literally call "libraries" - are we not artists and art historians?
We're far closer to Jimi than Roger, in many ways. Pots and kettles :)
Just because I drive my car with immense focus, make precision shifts, and hit the apex of all of my turns when getting onto and off of the freeway doesn't make me a race car driver.
Engineers don't just feel good vibes about science and mix it into their work. It is the core of their work.
Simply having a methodology absolutely is not sufficient for being an engineer.
And great, you have an arbitrary system of ethics, like everyone does I imagine. But no one holds you to these ethics.
Further down there is a sentence: "First, the Fuzz Face is a two-transistor feedback amplifier that turns a gentle sinusoid signal into an almost binary “fuzzy” output." But the figure does not match this - there is no "gentle sinusoid" wave shown on the first fuzz face plot.
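That "gentle sinusoid into an almost binary output" transformation is easy to reproduce in miniature: a large gain followed by asymmetric hard clipping. This is just an illustrative waveshaper with arbitrary numbers, not a circuit model of the actual two-transistor Fuzz Face:

```python
import math

def fuzz(sample, gain=50.0, clip_hi=0.7, clip_lo=-0.6):
    """Huge gain, then asymmetric hard clipping: sine in, near-square out."""
    return max(clip_lo, min(clip_hi, gain * sample))

# one cycle of a "gentle sinusoid" at modest amplitude
sine = [0.1 * math.sin(2 * math.pi * n / 64) for n in range(64)]
out = [fuzz(s) for s in sine]

# fraction of output samples pinned at one of the two clipping rails
rail_fraction = sum(1 for v in out if v in (0.7, -0.6)) / len(out)
```

With these values roughly 90% of the output samples sit pinned at one rail or the other, which is the "almost binary" waveform the sentence describes; only the zero crossings survive as brief transitions.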
Any guitarist in a 1940s big band would have a big hollowbody guitar and an amp. That combination is incredibly prone to feedback. Everyone worked to reduce feedback and avoid it. That's what I do with my hollowbody when I play with a big band. It's the first thing that happens when you turn up.
Hendrix did not "discover" feedback, and in fact he did not discover the musical uses of feedback - you can hear it in BB King records that predate Hendrix, where feedback makes his notes "sing."
What Hendrix did was turn feedback into an intentional musical creation that he treated as a melodic voice.