Electronics lets blind artist keep doing work he loves. July 30, 1981, Arizona Daily Star. Article by Carol Sowell; photo by Ron Medvescek.

Gordon Fuller is a man of vision.

Despite being legally blind because of a progressive eye disease, the 27-year-old artist is working to create a future of exciting possibilities for Tucson, for other artists and for himself.

In Fuller’s view of the future, the technology of electronics and computers will generate interest in local artists and strengthen Tucson’s sense of community.

The medium for making those dreams come true is cable television, said the resident of Tucson’s north side. As a member of the Tucson Commission on the Arts and Culture, he wrote the commission’s position paper on cable television. It is one of the research materials the City Council is studying before awarding a cable-TV franchise.

While working on the paper, Fuller became knowledgeable about community-access cable television. As a result, he is invited to speak at national conferences and acts as a consultant throughout the country on cable-TV programming by community groups.

Fuller, a Sedona native, said his lifelong ambition was to be an artist. He began studying art in childhood, and received a fine-arts education in Amsterdam, the Netherlands, at age 14.

But his plans changed when Fuller was 18. He learned that his eye disease, retinitis pigmentosa, eventually would deprive him of his sight. He reacted to the news with “total and utter shock,” and felt “useless, futile and hopeless,” Fuller said.

Believing a career in art would be impossible, he took courses in business at Arizona State University, in journalism and public relations at Northern Arizona University, and now in art and computer science at the University of Arizona.

At 20 he moved to Phoenix and opened a graphic-arts business, working with advertising and public-relations agencies.

Although successful running his ad agency, Fuller realized, “I was not fulfilling my promise as an artist.” By that time he could no longer see well enough to drive, but he decided to be an artist anyway.

Four years ago he returned to Tucson, where he had lived as a child, and promptly became involved with the local arts scene. He has been learning about computer graphics with his brother, an astrophysicist and data scientist, and decided, “Digital was an opportunity to create a new medium for artists.” Fuller discovered that his eye condition was particularly suited to working with the bright light of a video screen.

Now, collaborating with engineers, computer scientists, musicians, television technicians and others, Fuller produces video art electronically. With others interested in electronic art, he formed the Art/Science Center, a 501(c)(3) nonprofit community-service arts organization that produces video programs for community television. The center’s projects are financed by grants and in-kind contributions from industry, he said.

Sponsors of the Art/Science Center allow Fuller to buy time on computers at the University of Arizona, or he borrows experimental equipment from local labs and businesses. He is “keenly interested in exploring the creative possibilities of computer technology and the collaboration of arts and sciences,” Fuller said.

This spring he trained a crew of deaf and blind youths, whom he directed in producing a video documentary of Tucson’s “Very Special Arts Festival.” The festival featured handicapped artists from around the state performing or demonstrating their talents.

The project led to a Memorial Day appearance by Fuller and his crew on “Good Morning America.” The Kennedy Center for the Performing Arts sponsors the “Very Special Arts Festival,” and Fuller directed promotions and public relations for Arizona’s festival.

The cable-television project has absorbed most of Fuller’s time and energy for the past two years. He said his hours of discussion and study will be reflected in the City Council’s franchise decision. Although the arts commission does not plan to endorse any applicant, Fuller pointed out, “All 12 cable companies competing have proposed local arts and cultural channels.” Such a channel “can foster a great renaissance in terms of community access to local arts and culture. It can create national and international audiences interested in the arts here.”

When he is not involved with cable-TV or his electronic art, Fuller paints, draws, and writes science-fiction stories. Fuller is comfortable at the forefront of his many fields of interest – “I’ve always looked at everything much more closely than ordinary people do.”
Fuller describes his attitude toward his eye problem by invoking the theme of the International Year of the Handicapped (1981): “Being handicapped is a state of mind.” He serves as regional publicity chairman for the United Nations project.

Of his eye condition, he added, “It’s very difficult for me to be aware that my vision is impaired. It’s like any other personal characteristic. It caused me to be who I am. It’s given me some great insights into visual perception and the way the brain works.”

Fuller said he has taken advantage of all that technology has to offer in adjusting to his eye problem.
“Today, I live each day as it comes and work with what I have. It’s a battle I’ve won and don’t have any concerns with.”

Welcome to Fullervision

Where Artistry, Futurism, and Synchronicity Converge


Fullervision is more than just a platform; it’s a call to action for our future.

Co-hosts Gordon Fuller and Kealoha Bower invite you to explore new horizons with the FutureSense Podcast. Dive into inspiring stories and discussions that unveil Visions of a Brilliant Future. Join us as we weave a new social fabric of collaborative intelligence and engaged, participatory citizens, from the grassroots to the grasstops.

Meet our Team

  • Gordon Fuller
    A visionary, artist, and futurist, Gordon’s unique foresight and philanthropic endeavors shape the essence of Fullervision.
  • Kealoha Gardener
    Bringing a wealth of knowledge and passion to any project or endeavor, Kealoha co-pilots the journey into the future with Gordon, engaging guests and audiences alike.

[Contact Us] [Become a Patron] [Join Buy Me a Coffee]

The smart city is a perpetually unrealized utopia | MIT Technology Review

The Jesuit social historian Michel de Certeau suggested that resistance to the “celestial eye” of power from above must be met by the force of “ordinary practitioners of the city” who live “down below.”

When we assume that data is more important than the people who created it, we reduce the scope and potential of what diverse human bodies can bring to the “smart city” of the present and future. But the real “smart” city consists not only of commodity flows and information networks generating revenue streams for the likes of Cisco or Amazon. The smartness comes from the diverse human bodies of different genders, cultures, and classes whose rich, complex, and even fragile identities ultimately make the city what it is.
— Read on www.technologyreview.com/2022/06/24/1053969/smart-city-unrealized-utopia/

Beyond Mere Decentralization – Orthogonal Web | PLAN Systems

Beyond Mere Decentralization – Orthogonal Web
Plan OS is a new operating system built on distributed computation and trust. Unlike legacy OSes such as Windows or Linux, it eliminates the reliance on centralized storage and trusted third parties. It overcomes legacy problems in defense and commerce.

Plan OS is the natural outcome of several disruptions:

1) peer-to-peer file sharing, pioneered by Napster in 1999

2) peer-to-peer trust, exemplified by Satoshi Nakamoto’s Bitcoin, in 2009

3) pervasive connectivity through 5G and Ultrawideband, in 2020

4) pervasive computation through smart devices (IoT)

These disruptions have transformative potential in defense and commerce, which we’ll address shortly. They all appeared within this century. The problem is that legacy OSes are built around a computation model developed in the 1940s. What’s needed is a greenfield effort to build an OS upon the disruptions of today: smart environments and distributed trust.


The new battlefield is the internet of things (IoT). Stuxnet ushered in a new era of attacks, which range from smart devices in your home to the national power grid. A new kind of OS is needed to insulate devices from bad actors.


We live and work in environments that are getting smarter. Architects are incorporating Building Information Modeling, or BIM, into their plans. Civil engineers are planning smart cities. Service companies are attaching information to spaces. Consumer companies are selling smart assistant devices. A new kind of OS is needed to balance the needs of governance with privacy.

In the era of intelligent spaces, we need an OS that helps preserve the right to life, liberty, and the pursuit of happiness.


There is an obvious need to build scalable resilience mechanisms into our social safety net. We no longer have the time or luxury to ignore the current gaps in privacy, accessibility, and collaboration. Now is the time to develop systems and tools specifically for communities in need, to provision for privacy and universal inclusion, and to put forward effective strategies for localized problem solving and communications.

– The ARPAnet, a DARPA.mil Creation
– The Topologies of Decentralization
– Making Use of the Tools We Have
– Is ______ THE solution?
– Techno-Socratic Dialectic
– The Orthogonal Web
– Resource Mapping and the Consent of the Governed

If we follow the trends of innovation, the future of computing starts to look like incredible immersive experiences powered by distant servers. However, powerful surveillance-ready systems are becoming household items, while the infrastructure the internet was built on is still fundamentally insecure and incomplete.

With so many cutting-edge technologies being developed, from “the cloud” and IoT to blockchains, DLTs, DHTs, stacks, and DApps, it can be hard to keep track. Let’s take a step back for a second and ask: what even is decentralization? Why should anyone care about any of this at all? In this article we’ll go back in time to uncover the nuances of networks and explore concepts beyond mere decentralization.

The ARPAnet, a DARPA.mil Creation
The Advanced Research Projects Agency’s Network
Recall from any internet history primer that the ARPAnet (progenitor of the internet) was funded and implemented so that U.S. war fighters could employ encrypted intranets and low-bandwidth communications (using TCP/IP) to better defend against a nuclear-armed Soviet Union. It enabled decisive decision making by offering commanders near-real-time information exchange and “command & control” (C2) systems. Due to its generalizable nature, the applications of this technology were endless.

Once trained and equipped, soldiers and sailors could quickly relay commands, orders of battle, time-sensitive information, and unit reports, and plan operations spanning many locations at once. Having additional interoperable “high-level” protocols meant that the kind of I/O link, or a link’s security integrity, was no longer critical, because security and data packaging occur in the layers above TCP. This made it trivial to deploy and secure networks given the hardware encryption devices available at the time. “The goal was to exploit new computer technologies to meet the needs of military command and control against nuclear threats, achieve survivable control of US nuclear forces, and improve military tactical and management decision making” (S. Lukasik, 2011).
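The layering point here, that security and data packaging live above the transport, can be illustrated with a toy sketch. This is not how military COMSEC actually worked (that relied on dedicated hardware encryption devices); it only shows the principle that an untrusted link can carry opaque, integrity-protected blobs. The function names and pre-shared key are illustrative, and a real system would use authenticated encryption rather than a bare HMAC.

```python
import hashlib
import hmac
import json

SHARED_KEY = b"pre-shared key (toy stand-in for a hardware crypto device)"

def seal(payload: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Package data above the transport layer: the link below (TCP, radio,
    anything) only ever sees an opaque, integrity-protected blob."""
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return json.dumps({"tag": tag, "body": payload.decode()}).encode()

def open_sealed(blob: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Verify integrity before trusting anything the link delivered."""
    msg = json.loads(blob)
    body = msg["body"].encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, msg["tag"]):
        raise ValueError("message tampered with in transit")
    return body

# The transport itself can be untrusted; security sits in the layer above.
blob = seal(b"unit report: grid NV 123 456")
assert open_sealed(blob) == b"unit report: grid NV 123 456"
```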



Ocean of Things (OoT) Advanced Data Analytics – DARPA
The idea of “centralized control, decentralized execution” began taking root in military doctrine once some of these foundational systems were in place. One way this doctrine manifests is in the adoption of software and hardware tools and interfaces that are secure and easy (enough) to use even for the lowest-ranking service members, who at the time had likely never seen a computer. To this day, access is granted based on levels of trust and verification established by the National Security Act of 1947.

With network privacy built into these new systems using end-to-end hardware encryption, a new style of secure collaboration was made possible. With additional open protocols like HTTP and HTML by the early ’90s, sharing critical information between many large organizations became much simpler. Eventually Intelink was born: a group of secure intranets that served as a hub of information for services, agencies, engineers, operators, and analysts. Anyone with the proper clearance to the network could access common tools and databases, locate source reporting, coordinate with experts in the field, or even read the President’s Daily Brief.

The Topologies of Decentralization
Internet Relay Chat, or IRC, was one of the first widely deployed chat protocols, improving communications between commanders and field units through secure, multiplexed chat channels adaptable for any team (just ask Slack how awesome it is). U.S. armed forces use tools like IRC to this day in places like Afghanistan and Iraq to send and receive urgent messages such as 9-line medical evacuation (MEDEVAC) reports and troops-in-contact (TIC) reports, to coordinate resupply, and to support quickly unfolding activities and decisions. IRC is just one example of a simple, flexible tool that’s highly effective, as long as you can implement and secure it.
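Part of why IRC is so easy to implement and deploy is its dead-simple line framing. As a rough sketch (not a hardened implementation), a minimal parser for the RFC 1459 message format fits in a few lines:

```python
def parse_irc_line(line: str) -> dict:
    """Parse one IRC protocol line (RFC 1459 framing):
    [:prefix] COMMAND param1 param2 ... [:trailing]"""
    prefix = None
    if line.startswith(":"):
        # Optional prefix names the message's origin (nick!user@host).
        prefix, line = line[1:].split(" ", 1)
    trailing = None
    if " :" in line:
        # A trailing parameter may contain spaces (the chat text itself).
        line, trailing = line.split(" :", 1)
    parts = line.split()
    command, params = parts[0], parts[1:]
    if trailing is not None:
        params.append(trailing)
    return {"prefix": prefix, "command": command, "params": params}

msg = parse_irc_line(":medic!m@host PRIVMSG #medevac :9-line to follow, stand by")
assert msg["command"] == "PRIVMSG"
assert msg["params"] == ["#medevac", "9-line to follow, stand by"]
```

That the whole wire format is human-readable text is a large part of why IRC remains easy to secure, tunnel, and adapt.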

From C2 to C5ISR: a long and historic journey of developing command, control, communications, computers, cyber, intelligence, surveillance, and reconnaissance collection and sharing capabilities within a trust network.
“Decentralized execution is defined as the ‘delegation of authority to designated lower-level commanders’ and other tactical-level decision makers to achieve effective span of control and to foster disciplined initiative and tactical flexibility.”

[Image captions: HH-60s in flight, joint combat MEDEVAC · Afghanistan, March 2007, Helmand Province flooding · tactical operations center running mIRC chat and radios · Dutch front-line air controllers at an emergency command post above the flooded Helmand River · GIS and AI systems assist with fighting wildfires · California firefighters help Australia battle bush fires]
Whether operating in a remote environment or fully connected to peers, it’s imperative that mission-oriented operators have tools that help them better understand the environment; this is especially true when people’s lives are at stake.
It’s important to recognize that there are different contexts in which decentralization can arise: an organization’s decision making, the physical infrastructure of a network, or a software architecture’s implementation. All that to say, terms like “decentralized” and “distributed” aren’t that helpful without the appropriate context of how the systems are put into practice.

It is now common practice for software companies to require users to give up vital private data, devise lock-in tactics, or require the internet for their products to even function, all while those products remain black boxes of present and future risk. Although “cloud” technologies make using and sharing information vastly easier for end users, there are still major privacy and accessibility gaps – especially from a social-inclusion standpoint. For example, where are the open protocols and adaptive interfaces that enable spatial navigation for the blind and those with limited mobility? How can businesses, cities, and communities better serve people with disabilities using open protocols, information visualization, and data sharing? What happens when the internet or power goes out?

Even some of the most widely recognized “decentralized” infrastructures are not as decentralized as they appear (ref: Bitcoin, and original research by Cornell); additionally, they can require highly restrictive licenses and maintain proprietary elements that require buy-in on others’ business or governance models. The questions I always like to ask are: who owns the data, who can access that data, and where does it reside? Identifying this pattern can be the first step in deciding where you or your community fits into all of this confusion – er, progress.

For an introductory understanding of the subtle differences between decentralized and distributed systems, check out Julia Poenitzsch’s article “What’s the difference between Decentralized and Distributed? And how does this relate to Private vs. Public blockchains?”, published on Medium, Oct. 3, 2018. “There are degrees of decentralization and distribution, rather than hard divisions. How much decentralization or distribution is desirable then depends on your objectives.”

Decentralized vs Distributed – The Basics
Making Use of the Tools We Have
It is no coincidence that tools with profound utility are also broadly applicable to people everywhere, whether in business, emergency response, or “going off to college.” Today more than ever, being able to connect and transact easily, dependably, and securely is paramount, whether or not it’s obvious. Both the military intelligence community (MIC) and the Silicon Valley behemoths were built on the backbone of open initiatives and protocols like Linux, TCP/IP, HTTP, and HTML. How did so many people (the non-technical masses) miss out on owning and employing these tools directly?

Unfortunately, it has become all too easy to rely on a handful of platform gatekeepers. The technology field has been dominated by companies like MSFT, AAPL, GOOG, and AMZN, with device-specific operating systems, cloud services, apps, and web browsers as the de facto standard. But with the rise of powerful open software languages, distributed computing, and 3D graphics, the equation is rapidly changing. Major development efforts are underway in this nascent space, including projects like Holochain, Polkadot, EOS, Hyperledger Indy, and many others.

However, it’s pertinent to keep in mind that the infrastructure you decide to use is part of a much larger integration puzzle. It’s also important to make these infrastructure and data tools metadata-secure and usable (e.g., with an interface) by communities and their members for building additional resilient networks, with the flexibility to centralize or decentralize depending on the needs of the network.

Is ______ THE solution?
Projects like Holochain, DAT, and Substrate have emerged as open and flexible distributed infrastructure software, offering a decisive advantage over proprietary and closed technologies. Holochain provides infrastructure for developers to build apps on its peer-to-peer layer, a distributed hash table (DHT). Like PLAN, Holochain features a private instancing model, meaning that any group or community can use it without becoming dependent on other people, groups, or organizations.
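To make the DHT idea concrete, here is a toy sketch (not Holochain’s actual API) of the core mechanism: hashing keys onto a ring of nodes so that storage is spread across peers with no central server deciding placement.

```python
import hashlib
from bisect import bisect

def h(value: str) -> int:
    """Map any string to a point on the hash ring."""
    return int(hashlib.sha256(value.encode()).hexdigest(), 16)

class ToyDHT:
    """A toy distributed hash table: each key lives on the node whose ring
    position follows the key's hash. Real DHTs (Kademlia, Holochain's
    validating DHT) add replication, routing, and peer validation."""

    def __init__(self, nodes):
        self.ring = sorted((h(n), n) for n in nodes)
        self.store = {n: {} for n in nodes}

    def node_for(self, key: str) -> str:
        # First node clockwise from the key's hash (wrapping around).
        points = [p for p, _ in self.ring]
        idx = bisect(points, h(key)) % len(self.ring)
        return self.ring[idx][1]

    def put(self, key, value):
        self.store[self.node_for(key)][key] = value

    def get(self, key):
        return self.store[self.node_for(key)].get(key)

dht = ToyDHT(["alice", "bob", "carol"])
dht.put("community/events", "flea market, saturday")
assert dht.get("community/events") == "flea market, saturday"
```

Because placement is a pure function of the key’s hash, any peer can locate any entry without asking a central coordinator, which is the property the “private instancing” model builds on.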

In contrast, many distributed technologies ultimately keep a proprietary element behind lock and key, a paywall, or entwined with a digital currency. It’s important to point out that while PLAN is designed to be self-contained, it can also integrate with other existing DLT / Blockchain layers that are compatible with our architecture and Design Principles. PLAN is a kind of technology glue that brings the components of a system all together, including infrastructure, a complete security data-model, and interfaces that are required for non-technical users to participate (UI/UX).


Holochain = Distributed P2P Infrastructure

PLAN = Integrated Platform with Pluggable Components

A component-based approach works well for PLAN because it allows the end user to choose a configuration for their local needs, whether on or off-grid. As we’ve seen in many blockchain-specific architectures, what happens when the community’s needs outstrip the conceptual models imposed by the distributed infrastructure layer? What happens when those models are inherently too complex for community end users? And even if we all agree on how pluggable the bottom-most infrastructure layer should be, what about usability? If there hasn’t been an intentional vision or plan for a user experience that is agnostic to both device operating system and infrastructure, then how far along are we really?
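A component-based design of this kind might look, in a minimal hypothetical sketch (these interface names are illustrative, not PLAN’s real API), like coding the platform against an abstract component and letting the community plug in whichever backend fits their on- or off-grid situation:

```python
from abc import ABC, abstractmethod

class StorageComponent(ABC):
    """A pluggable storage backend: the platform codes against this
    interface; the community picks the implementation."""

    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStorage(StorageComponent):
    """Off-grid / single-device configuration; a cloud- or DHT-backed
    class could implement the same interface."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]

def make_node(storage: StorageComponent):
    # The rest of the stack never knows which backend was chosen.
    storage.put("greeting", b"hello")
    return storage.get("greeting")

assert make_node(InMemoryStorage()) == b"hello"
```

The design choice is that centralizing or decentralizing becomes a configuration decision at the component boundary, not a rewrite of the application above it.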

With these barriers to entry in mind, there is still the task of building usable interfaces and maintaining integration for all the different OS’s, platforms, and browsers that are out there. Given that they all come with dependencies and layers of complexity, which ones do you target? While having dependencies is generally acceptable when deriving value from a particular network or specific kinds of applications, it’s just not practical to assume app developers will ensure critical communications infrastructure is usable, private, and universally accessible. This is especially true when the demand to capture that data is so high.


[Image: “AI Is Helping Fight Wildfires Before They Start,” Time Magazine. But what kind of tools do these communities have?]
Techno-Socratic Dialectic
Is it realistic that all communities will forever rely on a particular solution as the ideal for managing critical privacy needs, connectivity, identity, value exchange, or trust? What is the cost of finding out the hard way? Do we just discard the principle that communities should choose the components best suited to their needs? A team of journalists doing a high-risk exposé is right to demand a peer-to-peer storage layer, which will have rather different security and performance trade-offs than, say, a community healing center, a city block organizing a weekly flea market, or a crafting guild with accessibility needs for seniors and people with disabilities.

Why Not “Decentralization”?

It connotes a process to disrupt the status quo… but suggests no vision of a better thing to replace it with.

It suggests a topological fix… but are our true problems merely topological?

So what’s next for “decentralization”? Is it going out of style before it even got popular? I would say yes, hopefully for all our sakes. There is a much more nuanced and actionable approach available than mere decentralization. While there is no common parlance I can reference to provide an immediately satisfying understanding, I will refer to this approach as the Orthogonal Web, a concept coined and articulated by Peter Wang, CEO of Anaconda, during a lightning talk at Decentralized Web (DWeb) Camp 2019 (led and sponsored by the Internet Archive).

The Orthogonal Web
Peter points out there are “three critical elements of a communication and information system that need to be held orthogonal to each other”: Data | Transport | Identity. Key to understanding this Privacy Trinity is that “all three legs affect each other, but all three legs need to be put together in an orthogonal way … 90 degrees from each other so they cannot be used to capture the other.”

For example, if you are sending a sensitive email using a transport method that relies on Gmail infrastructure, which retains part or all of your message on its servers, the identity and content of the message are tacitly exposed to Google, and by extension to anyone else able to gain access to those lines of communication. The key takeaway from Peter is that with any conventional infrastructure built on top of the internet (aka the ARPAnet), that orthogonality does not exist.

Lightning Talk – Rethinking Decentralization As Orthogonality by Peter Wang, 10min
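As a rough illustration of Wang’s three orthogonal legs, the toy sketch below keeps identity (signing), data (sealing), and transport as independent components, so the carrier never learns the content and the signer never touches the channel. The XOR “cipher” and HMAC signature are purely illustrative stand-ins; a real system would use authenticated encryption and public-key signatures.

```python
import hashlib
import hmac
from dataclasses import dataclass

@dataclass
class Identity:
    """Identity leg: isolated from data and transport."""
    name: str
    key: bytes

    def sign(self, blob: bytes) -> str:
        # Signs opaque bytes; never sees plaintext or the channel.
        return hmac.new(self.key, blob, hashlib.sha256).hexdigest()

def seal_data(plaintext: bytes, secret: bytes) -> bytes:
    """Data leg: toy XOR 'encryption' for illustration only."""
    pad = hashlib.sha256(secret).digest()
    return bytes(b ^ pad[i % len(pad)] for i, b in enumerate(plaintext))

def transport(blob: bytes) -> bytes:
    """Transport leg: a stand-in for any untrusted channel. It carries
    only ciphertext; it cannot capture data or identity."""
    return blob

alice = Identity("alice", b"identity-key")
ciphertext = seal_data(b"meet at noon", b"data-key")
signature = alice.sign(ciphertext)
received = transport(ciphertext)
assert seal_data(received, b"data-key") == b"meet at noon"  # XOR is symmetric
assert alice.sign(received) == signature
```

The point of the sketch is structural: each leg could be swapped out (a different channel, a different cipher, a different trust chain) without giving the other two legs any new power over the message.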

Pillars of Information Integrity

– Data: isolated from identity & transport (the data paths)
– Transport: isolated from data & identity
– Identity: isolated from data & transport; user-controlled, spanning centralized & decentralized trust chains


Resource Mapping and the Consent of the Governed

Maps are the answer to a critical question: “What is in my environment?” The purpose-driven idea that has carried over into PLAN is to create a robust information-visualization platform that fosters the resiliency of shared habitats and local relationships through collaborative mapping and exploration. For that to be possible, we need to completely rethink systems so that collaboration, as well as privacy, is built in by design.
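A minimal sketch of the collaborative-mapping idea (hypothetical types, not PLAN’s actual data model): members add resources they can vouch for, and anyone can ask what is in their environment.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    lat: float
    lon: float
    added_by: str  # every entry is attributable to the member who added it

@dataclass
class CommunityMap:
    """A toy collaborative resource map answering
    'what is in my environment?'"""
    resources: list = field(default_factory=list)

    def add(self, member: str, name: str, lat: float, lon: float):
        self.resources.append(Resource(name, lat, lon, added_by=member))

    def near(self, lat: float, lon: float, radius: float):
        # Naive planar distance; fine for a neighborhood-scale sketch.
        return [r for r in self.resources
                if (r.lat - lat) ** 2 + (r.lon - lon) ** 2 <= radius ** 2]

m = CommunityMap()
m.add("ana", "water tank", 32.23, -110.95)
m.add("ben", "tool library", 32.24, -110.96)
assert {r.name for r in m.near(32.23, -110.95, 0.05)} == {"water tank", "tool library"}
```

Even in this toy form, the `added_by` field shows where the ownership and consent questions of the next paragraph attach: every datum has a contributor whose agreement governs its sharing.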

Resource mapping is indeed one of the high-utility applications that can be harnessed with spatially collaborative systems; however, who owns that data and where does it reside? People are right to be skeptical of a system designed to, let’s say, track all the money, resources, or the infirm. There’s an important human aspect at the heart of the matter that is rarely talked about or acknowledged. Unless a consensual relationship to be part of a data-sharing community (or any community, for that matter) is fostered, and an agreement that governs the span of control in such a system is fully articulated and checked, then all we’ve really accomplished is creating additional moral hazard to navigate.

Community-Centric Technology
Community-centric technology means that systems are not only designed for collaboration, but owned, implemented, and managed at the local community and individual level. To facilitate this shift from developer-centric software, we’re working on bringing together a technology stack that is modular and pluggable, from the encryption and storage layers, through extensible core functionality, all the way to the interfaces that make that functionality accessible to end users.
— Read on www.plan-systems.org/2020/04/03/beyond-mere-decentralization/