All posts by djukon

The Penta Proximus Architecture

Here is a holistic computing architecture block diagram proposal (offered mainly as inspiration) to be used in OSE’s GVCS (or a similar) computing platform. It is a 32-bit version, arranged in a General Purpose Processing Unit (GPPU) → Video PU → Data Access PU (DAPU) → Sound PU → Input & Expansions PU configuration sequence.

The highlight is that the entire node can access up to 14 GiB using just 32-bit address buses (each component addresses up to 4 GiB). The view is that the motherboard should not need to be replaced very frequently, and that high wiring density (i.e. 64-bit address buses) may not be feasible in the early days of local production.
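To make the address-space arithmetic concrete, here is a minimal Python sketch of one way the numbers could add up. It is my own illustration, not taken from the diagram: each PU spends part of its own 4 GiB map on bridge windows (apertures) into its neighbors, and whatever remains is local. Only the chain order comes from the proposal; the 0.75 GiB window size is an arbitrary placeholder chosen so the total lands on the 14 GiB headline figure, while the actual layout lives in the .odg source linked below.

    GiB = 1024 ** 3
    BUS_SPACE = 4 * GiB            # one 32-bit address bus covers 4 GiB

    # Chain order from the proposal (each PU only talks to its neighbors).
    chain = ["GPPU", "VideoPU", "DAPU", "SoundPU", "InputExpPU"]

    # Hypothetical size of the aperture a PU opens towards one neighbor;
    # purely illustrative, not taken from the diagram.
    WINDOW = int(0.75 * GiB)

    def neighbors(i):
        """Indices of the PUs adjacent to chain[i]."""
        return [j for j in (i - 1, i + 1) if 0 <= j < len(chain)]

    local = {}
    for i, pu in enumerate(chain):
        apertures = WINDOW * len(neighbors(i))   # address space spent on bridges
        local[pu] = BUS_SPACE - apertures        # what is left for local memory

    for pu, size in local.items():
        print(f"{pu:10s} local: {size / GiB:5.2f} GiB")
    print(f"node total       : {sum(local.values()) / GiB:5.2f} GiB")  # 14.00 GiB

The point is only the accounting: every bus stays 32 bits wide, yet the node as a whole exposes more memory than any single PU can map at once.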

I’ve been sitting on this draft for a couple of months, trying to figure out what could be significantly improved. I tried several other sequence combinations, and this one consistently surpassed the others (in my view; you are free to experiment). The combinations that did surpass it in some areas were the ones with the DAPU adjacent to the GPPU; those are better suited to data-mining applications.

I also tried expanding the architecture to six (hexa) outer components, which didn’t resonate as well. Anyway, this is all intuition work, based on experience and imagination; I just want to inspire people who already have the chops to roll something like this out into the wild. Real-world testing would give the best feedback, but that is still far off, since it would require redesigning and implementing most of the computing stack (including the operating system).

Something I “discovered”, which is beyond the scope of this post, is combining several nodes in a pattern for cluster computing. For that, a hexa-proximus architecture would fit like a glove. Maybe even in a geodesic dome, who knows.

Here’s the source file you can download and edit: penta-proximus-alpha.odg (Open/Libre Office Draw).

General Computing Task Patterns

In the early days of home computing, computer architecture designers did not have a crystal ball to show them which tasks computer users would most commonly engage in over the next thirty or more years. We still don’t have that crystal ball, but we can speculate on what could have been, had we known.

Some innovative architectures, such as the Commodore Amiga (the initial A1000/“Lorraine” model, developed from 1982 to 1985), went as far as putting graphics and sound in dedicated parallel custom chips, offloading work from the CPU. Yet today we know that graphics and sound are not the only priorities in common tasks; networking and mass storage are sometimes the bottlenecks to be tackled. We can of course forgive the lack of foresight of 1980s designers: back then, the Internet was not available to regular consumers, and cheap mass storage solutions were still a mirage.

Let’s stop before I continue on a Captain Hindsight rant fest. Before proposing any holistically revised architecture, in this post I am first going to identify the patterns of general computing tasks that we, as humans, execute frequently, from my “default” perspective.

The form factor of general computing nodes is not that important for this exercise. Anything goes, from rack-mounted servers to smartphones, with tablets, laptops, desktops and expandable gaming consoles in the mix. We are trying to figure out what is common across them in current-day usage. So my current categorization of general computing tasks is as follows:

  • Gaming/Simulations;
  • Media Content Consumption and Production;
  • Communications (computing node inter-connectivity on a planetary packet network);
  • Office Productivity/Digital Assistance;
  • Archival and Data Mining.

The dividing model I chose to use is just that, a model. For instance, I could have given Scenario Prediction (Artificial Intelligence) a category of its own, but that would just be a different perspective on simulation and data mining.
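To make the model slightly more tangible, here is a tiny Python sketch pairing each category with the subsystems it tends to stress the most; graphics, sound, networking and mass storage are the ones already mentioned above as historical bottlenecks. The pairing is my own subjective assumption, purely for illustration.

    # Rough, subjective pairing (an illustrative assumption, not a definitive
    # mapping) of the five task categories with the subsystems they stress.
    TASK_LOAD = {
        "Gaming/Simulations":                       ["CPU", "graphics", "sound"],
        "Media Content Consumption and Production": ["graphics", "sound", "mass storage"],
        "Communications":                           ["networking"],
        "Office Productivity/Digital Assistance":   ["CPU"],
        "Archival and Data Mining":                 ["mass storage", "CPU"],
    }

    for task, subsystems in TASK_LOAD.items():
        print(f"{task:42s} -> {', '.join(subsystems)}")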

Are there any other major general computing patterns that may emerge in the future? I am not sure. What I do realize is that new form factors, interfaces and sensors (and combinations of them) find new contexts in which to engage in different computing scenarios, as has happened recently with smartphones and tablets. The major patterns I point to in the example categorization have not changed much since the late 90s; I could take this model back roughly 15 years and it would still make sense.

This post is a build-up to a holistic (transistor-based) computer architecture I’ve been theoretically playing around with. Stay tuned.

Regenerative Computing: An Introduction

Having followed permaculture for some time now, I’ve been figuring out how to look at computing topics through a permaculture lens.

Permaculture, if you have not heard of it yet, is a way of functionally designing the world around us to leave us with a permanent (i.e. sustainable) culture to live in, instead of wasting resources in the name of profit. Well, that’s how I define it; every permaculture specialist has their own definition anyway. If you take a look at permaculture design courses, most of them deal with landscape design and food production systems, though the approach can be extended to any field of human endeavor. Just put “permaculture” into a search engine and shop around for definitions until you are satisfied.

Computing, which I will focus on in this blog, is an important component of humanity’s next-generation socio-economic paradigm, i.e. the post-oil, post-industrial or post-scarcity one; relative terms compared with what we are predominantly living today (2014). If this is also the first time you’ve come into contact with socio-economic transition topics, I suggest taking some time to watch at least two documentaries: “Zeitgeist: Moving Forward” (2011) and “In Transition 2.0” (2012). Solutions-wise, I personally lean more towards the bottom-up, community-driven, network approach of the Transition Network than towards what the Zeitgeist Movement sometimes portrays about transition.

If you don’t want to spend a few hours on these documentaries, just reflect on previous civilizational transitions and the base technologies that allowed the next-generation paradigms to emerge. In the move from nomadic to agricultural society, writing and agriculture were the main “tools” that allowed the sedentary (scarcity) society to take shape around the world. That first transition took three thousand years to spread. The following transition, to industrial society roughly 200 years ago, came with the advent of the steam engine and Gutenberg press technology, and took around eighty years to spread around the globe. Today we are at the cusp of a new transition, mainly thanks to the digital revolution, which brought us information automation and global, instant, peer-to-peer communications. Interesting decades lie ahead.

What I argue is that we must be able to produce/source computing tools locally before the transition to whatever comes next is complete. Currently (2014), we still source electronic components through a global market, which is very fragile as it depends on the illusion of infinite growth, based on just one metric: profit. As we can figure out quite easily, infinite growth on a finite resource system (i.e. the planet) is impossible to sustain; there is only a time window in which you can delude yourself within that paradigm.

Coming back to permaculture, I must say that I do not agree with the use of the term “permanent”, because the Solar System (and the Universe, for that matter) will eventually cease to exist. So “permanent” is not a term I am very fond of using. My alternative is “regenerative”, which, combined with computing, results in “regenacomputing”. The principles borrowed from permaculture are not changed or enhanced; I just want to be more precise with words.

I am also a retrocomputing enthusiast, as I find many interesting design patterns in old, abandoned technology. The Commodore Amiga is one such platform, on which I am basing a new regenerative general computing design. More on that in following posts.

I have been extremely inspired by the Open Source Ecology project, which is in its early stage of development. Its purpose is to design 50+ small-scale, industrial-grade machines to produce the appliances of modern comfort locally. Once that technology tree reaches completion, self-produced electronic and digital technology will come into play. That is what I intend to contribute to the world with this blog, and with any side projects that may come out of it. Much like what the Wikispeed project is doing for automotive design, I intend to provide the equivalent for computing.

Personally, I have never designed hardware in practice, just some theory in university courses. That may benefit me, as I am not locked into any traditional way of thinking about the process. My skills focus more on software development and (broad) pattern recognition, but I am open to any task and to acquiring new skills. As Robert A. Heinlein put it, “Specialization is for insects”. I am really at the beginning of this whole process, and I hope you enjoy reading this blog as much as I will enjoy writing it.