Implicate Orders

Chapter Notes: Part II, Chapter 3

Primary Source Extracts & Notes to Self

Ben Loomis and Ben Loomis 2060
Jul 02, 2025

NB: When writing a history, the hours spent in research tend to outweigh those spent shaping the narrative itself. Along the way, you collect all kinds of material — primary sources, theoretical reflections, marginalia — that never make it into the final cut. These “Chapter Notes” are for readers who want to know more about the people and events behind the story, and who don’t mind wandering down a few adjacent corridors.

(Free subscribers get a glimpse; paid subscribers ($5/year) get full access.)

~

Notes for Part II, Chapter 3 include a timeline of technological developments in Interfaces from 2027-35, as well as several excerpts from primary sources related to GameWork and the underbelly of the economic changes during these years, including (at the end) the article where I first learned about Derek Rhys.

~

In Chapter 3, I included a brief timeline showing the upgrades to Simspace-related equipment (then called “Virtual Reality” or “VR”). It was adapted from a larger timeline I created while researching the development of Interface tech for this history. Here is an abridged version of that timeline, covering hardware and software developments as well as notes on adoption and important early use cases.

(Note that while we now generally see Simspace and Layerspace as related options within Interface hardware, up until around 2032 both “VR” and “AR” (for “Augmented Reality”) were the standard terms, and they are what appear in the primary sources. I use all four terms in the timeline below.)

Interface Timelines:

Eyewear:

2027

MetaMind (Meta) launches — the first mass-market interface kit, including AR glasses and earbuds. High price, elite adoption.

2028

Google Genius and Microsoft Harmonic enter market. Prices begin to fall.

Meta introduces VisionLite — streamlined AR glasses for students and gig workers.

2029

Interface glasses reach mainstream adoption in WEIRD societies.

Public wear still socially awkward, especially in face-to-face conversation.

“Eyewear etiquette” guidelines circulate (e.g. “Tap to Dim” and “Remove at Intimate Meals”).

Cognitive Networks (Cog-Nets) launch — often require visual layers for full participation.

2030

First prescription-integrated frames hit the market.

Sport-style wraparounds popular among logistics, trades, and VR gamers — better sensor arrays and broader field overlays.

2031

Mood-adaptive lenses released — subtly adjust brightness, contrast, and symbolic cue density based on user state.

Social segmentation accelerates: interface eyewear now reflects status, job type, even subculture.

Fashion brands launch designer AR frames; eyewear becomes a status marker, much as watches once were.

2032

First viable layerspace contact lenses demonstrated (InSight, Mojo) — real-world limited rollout.

2033

Layer density in everyday life increases: signage, commerce, legal disclaimers, even graffiti encoded in visual overlays.

Indie backlash emerges — wearers of “dumb glasses” protest symbolic manipulation.

2034

Early trials of brain-computer-linked displays begin. Too invasive for most users.

Social default: interface glasses worn continuously in professional-class settings.

2035

By October: over 80% of adults in networked economies wear interface glasses regularly.

Earbuds:

2027

MetaMind kits ship with dual-channel smart earbuds — capable of localized sound staging, tone modulation, and passive listening.

Always-on listening framed as “contextual awareness.”

2028

GeniusPods (Google) and Harmonic Echoes (Microsoft) released — designed for comfort over long wear.

Voice interaction becomes standard across productivity apps.

2029

Bone-conduction audio enters mainstream — allows open-ear experience with clearer ambient sound.

Meta’s earbud update adds bioacoustic profiling — mood, stress, and vocal fatigue.

First widespread use of “whisper coaching” in meetings, pitches, and customer service.

Users begin wearing buds all day. New social norm: “the polite remove” (taking out one bud to signal attention).

2030

Directional 3D audio and spatial-awareness mapping become standard — the Voice can now “place” itself near the user’s head based on emotional tone or task urgency.

2031

Subdermal tethering introduced — earbuds pair with smartwatches and biometric patches to fine-tune Voice delivery in real time.

Entry-level “Companion Voices” introduced in schools and assisted living facilities.

Earbud etiquette collapses — constant wear no longer considered rude in most settings.

2032

High-end models offer custom acoustic environments — e.g. calming background textures during focus work.

First non-removable earbud implants trialed in South Korea and UAE — luxury class only.

2033

Earbuds now worn for 12–16 hours/day by a majority of networked professionals.

“Silence anxiety” becomes a measurable condition; users report distress when the Voice is unavailable.

2034

Neural whisper transmission (NWT) tested — non-vocal, sub-auditory cognitive prompting via ear canal stimulation.

Pairing with dream-training systems begins among elite creatives and military groups.

2035

Voice no longer considered a “feature” — it’s a presence, integrated into thought processes.

Over 90% of networked adults report “daily co-narration” from their Voice.

Biometric Sensors:

2027

Meta kits ship with optional wristband monitors — tracking heart rate, temperature, and galvanic skin response — marketed as “wellness optimization.”

2028

Biometric integration becomes standard in interface kits — even entry-level models include pulse oximeters and voice-stress analyzers.

Consumer fitness wearables (e.g. Oura, WHOOP) partner with interface providers to share data.

2029

Adhesive chemical patches emerge — disposable skin-contact sensors that detect cortisol, glucose, and hydration.

Voice begins adapting delivery style and content in response to stress spikes or fatigue.

2030

First full-body adaptive haptic feedback systems use biometric input to modulate pressure, rhythm, and temperature in interface responses.

Hospitals and elite employers adopt closed-loop systems that respond to user vitals dynamically — stress-reduction protocols, task reprioritization, ambient cues.

2031

Biometric compliance scores rolled into Cog-Net performance metrics. Users who “self-regulate well” receive perks and elevated trust ratings.

Pilot programs in Japan and Dubai trial long-term subdermal implants for executives and continuity-of-government personnel.

2032

Mass-market neurochemo patches available — manage dopamine, focus agents, even subtle mood shifts based on task type.

“Chemical transparency” becomes a debated workplace issue: Should coworkers see your cognitive state?

2033

Biometric sync contracts introduced in elite teams — you agree to let your system adjust emotional tone, microdosing, and feedback frequency based on shared mission state.

Implants now standard in defense, aviation, and elite finance, and among GameWork immersion workers.

2034

Internalized self-regulation systems emerge — biometric implants connected to cognitive overlays can subtly nudge behavior before the user becomes consciously aware of discomfort or distraction.
