Namibia’s strange internet domain fiasco.

Namibia has found itself in a strange situation. Our country’s top-level domain is owned by a private Namibian entity. This is a problem. Although it is not uncommon for private entities to be in administrative control of issuing a country’s domain names, these entities usually operate within mandates set by a national communications regulator or by government itself. But not in Namibia, boeta, you just have Ondis. Chessssss!

This private entity, called Ondis, comprises three main individual shareholders. In the early 90s they saw that the Namibian government and private sector were sleeping on the internet, so they went and registered the .na domain with ICANN and now hold the sole right to issue .na and .com.na domain names. The company has issued all Namibian domain names through its instruments, including those of government, for close to two decades now, at an average rate of 100 USD/year for .com.na domains and 500 USD/year for the .na top-level domain. With close to 3,000 Namibian domain registrations to date, these guys must have made a pretty penny. Then again, 3,000 domain name registrations for a population of 2.3 million people is not very exciting anyway. I daresay the fact that three guys are figuratively holding the whole country to ransom has something to do with that.

Super high domain registration costs for Namibian locals.

The cost of Namibian domain names is prohibitive, so many Namibians end up purchasing .com and other domains, which can be had for as little as 5 USD (NAD 56) from services such as GoDaddy.com and Namecheap.com. Dr. Eberhard Lisse, one of the core shareholders of Ondis, has in the past defended the prices, saying the high cost was due to the size of Namibia’s economy and that you would find similar pricing schemes in comparable countries. A little research shows that this is simply not true.
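To make the comparison concrete, here is a quick back-of-the-envelope sketch in Python. The exchange rate is simply the one implied by this post’s own 5 USD = NAD 56 figure, and the prices are the ones quoted above; treat the labels as illustrative, not official registrar price lists.

```python
USD_TO_NAD = 56 / 5  # exchange rate implied by the 5 USD = NAD 56 figure above

# Annual registration prices quoted in this post (USD/year)
prices_usd = {
    ".com (GoDaddy/Namecheap, from)": 5,
    ".com.na (Ondis)": 100,
    ".na (Ondis)": 500,
}

cheapest = min(prices_usd.values())
for name, usd in prices_usd.items():
    nad = usd * USD_TO_NAD
    print(f"{name}: {usd} USD/year ({nad:.0f} NAD/year), "
          f"{usd / cheapest:.0f}x the cheapest .com")
```

At those rates a .com.na costs 20 times, and a .na a full 100 times, what the cheapest generic .com goes for.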

Southern African small-country domain prices (figures from each respective country’s domain registrar, found by internet search).

Globally, domain registration prices for country-code domains average around 10-20 USD per year, whereas Namibia’s are well over that range, as the figure above shows. The Namibian ICT sector has had to make do with this problem, but it has by no means kept quiet on the issue. Since 2008 the Namibian ICT Alliance has repeatedly requested that Ondis adopt a more inclusive board so that stakeholders in the various ICT sectors could have better representation. Ondis has not yet ratified the request and has accused the ICT Alliance of exerting political pressure instead of engaging with it. Frans Ndoroma, MD of Telecom Namibia, has also called for a multi-stakeholder body to take control of the domain registration licence.

I personally see this as a failure of both government and the private sector, and as nothing but greed and short-sightedness on the part of Ondis. Government and the private sector should have exerted far more pressure to standardise the process. Now that the internet permeates nearly every part of the daily operation of most public and private entities, the lack of easy and cost-effective domain registration has become a bottleneck to the proper representation of those entities online. Conversely, Ondis should have initiated steps to transfer control of .na domain registrations to a publicly accountable organisation years ago. They have cited a lack of expertise in domain management and Namibia’s small population as causes of the slow uptake of Namibian domain names, but coming from a private entity sitting outside of public scrutiny, that is just ludicrous. How could they even hope to address those concerns without a relationship with civil bodies in government? Their holding on to the ccTLD licence with such fervour leads one to assume their motive is purely financial, whatever the case may be.

Lastly, where is ICANN in all this? In 2007, at the Rio Internet Governance Forum, they apparently promised Netumbo Nandi-Ndaitwah, then Minister of Information & Broadcasting, that the ccTLD licence would be transferred to government. You know what they say about promises. They need to be held to account, whether or not such a promise was made. That they continually allow this situation to persist by ratifying Ondis’s ownership of the Namibian ccTLD goes against their own tenets of accessibility and accountability.

There does seem to be a growing amount of talk around the country on this very issue, and hopefully the newly established CRAN and the ICT ministry will spearhead a task force to bring all concerned stakeholders together to sort this issue out. With the unveiling of the new domestic IXP, making national internet domain assignment a transparent and optimised process is key to meeting the challenges we face in a rapidly changing technological landscape.

Wikipedia and Indigenous Knowledge Systems

You must expect that from time to time this blog will concern itself with research matters around information systems and issues of their appropriation or adoption in indigenous communities. This is because part of our social development agenda is to create tools that aid indigenous communities. That being said, I would like to, albeit at a very high level, deconstruct the implications of participatory computing systems like Wikipedia and the role they play in empowering Namibian communities and those of countries like it. This article is a preamble to a more comprehensive report that I’m working on during the course of the year.

Once A Nomad

With the recent advancements in ICT4D, Namibia has seen many of its indigenous communities receive huge investments in telecommunication infrastructure. There are many reports that document the progress of this endeavor, and I suspect they form part of a greater discourse about the proverbial “bridging the digital divide”. My concern, however, is not whether rural schools are getting educational necessities like internet access, but rather the socio-technical issues that come with introducing “foreign technologies” into indigenous communities.

I recently got dragged into the maelstrom of Wikipedia and what it means for indigenous knowledge systems. I’m going to ignore any academic citation red tape right now and tell you that indigenous knowledge is popularly defined as “knowledge acquired by people who have had a long rapport with their environment”. The Himba of Namibia for instance would typically qualify as possessors of indigenous knowledge since their livelihood over the years has relied greatly on knowledge they acquired from living in Southern African environments for a long time.

Jimmy Wales, one of the founders of Wikipedia, has described Wikipedia’s grand vision as “creating a world where every person on the planet is given free access to the sum of all human knowledge”. I’d like to point out that “sum of all human knowledge” is really where it gets tricky. Currently, the regulations that control the commission and omission of information on Wikipedia are laden with what we call a systemic bias. This systemic bias prevents us from aggregating the sum of human knowledge because, generally, the curators running the show stem from Western origins and bring with them Western paradigms. This is promulgated by a few counter-intuitive rules that essentially say everyone is allowed to have their say, as long as they do it in erudite English. One rule (notability), for instance, requires that any information contributed to Wikipedia be anchored by reliable sources. The problem is that “reliable sources” is defined from an occidental point of view.

To put things in perspective, this essentially means that if you as a Himba wanted to submit an article to Wikipedia documenting a unique customary tradition, the article would have to be substantiated by enough notable sources to survive Wikipedia’s unforgiving curators. Now, finding reliable sources might not be a problem when writing about a particular butterfly in North America, since zoologists and historians have documented that landscape to near-exhaustive limits, but this is not the case for Namibia. A lot of Namibia’s history and indigenous knowledge is undocumented, and what little has been written about it has been written from the viewpoint of a Western settler intelligentsia, which introduces a serious narrative bias.
Wikibias?
The entire thing is a Penrose staircase of never-ending issues, not only socio-technical but sometimes behavioural and cultural as well. With strong cultural underpinnings, we see local value systems clashing violently with those embedded in imported technologies. Perhaps I’m being too idealistic, but when we leave this planet for the stars one day I’d like to leave with the wealth of its knowledge on a memory stick, and I’m not just talking about knowledge of my favorite composer, Frederic Chopin, but also of how my ancestors made my favorite traditional drink, Oshikundu. Currently, many research groups are experimenting with meta tools that make it easy for would-be editors to become frequent contributors. The declining retention rate of editors on English Wikipedia does not help that faint glimmer of hope for contributions to the sum of human knowledge. One would think that to overcome the local challenges facing our meagre repositories of indigenous knowledge we have to wait for a top-down solution, but if M-PESA is anything to go by, maybe we ought to find local solutions by co-opting a Western technology.

(Note the irony in all the wiki links in this post 😉 )

Nanotechnology: The Future of Faster Electronics

With 15,342 atoms, this parallel-shaft speed reducer gear is one of the largest nanomechanical devices ever modeled in atomic detail.

Nanotechnology refers to an area of science that involves the manipulation of matter on an atomic or molecular scale, generally accepted to mean the 1 to 100 nanometre range in at least one dimension. It involves the creation of chemicals, materials and even functioning mechanical devices at an extremely small scale.

So just how small are these nano objects we’re talking about, I hear you ask? Well, since you asked so nicely, I’ll attempt to explain. For starters, just to give you an idea: 1 nanometre (nm) is one billionth of a metre. A human hair is about 80,000 - 100,000 nm thick. Using the gift of imagination, let’s shrink ourselves down to the nano scale (effectively making us The Nano-Tech Guys? Has a nice ring to it, don’t you think?). Now that we’re tiny, let’s adjust the scale of things and take a look at a few familiar objects, to bring things into perspective.

If we took 1 nm as representative of 1 metre, then mini me would be 1.7 nm tall. Our previously mentioned human hair would be 80,000 metres thick. That’s 80 kilometres. A sheet of paper would be 100 km thick. So if mini me stood next to a sheet of paper, it would be like big me standing next to something 100 km high. An object the thickness of a coin would be high enough to bump into some low earth orbit satellites. I hope this helps put things into perspective. Anyway, moving on…
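For the sceptical, the scale-up arithmetic above can be checked with a few lines of Python. The object thicknesses are the rough values this post assumes: 80 µm for a hair, 100 µm for a sheet of paper, and 1.5 mm for a coin.

```python
NM_PER_M = 1e9  # nanometres in a metre

def apparent_size_m(real_size_m: float) -> float:
    """If 1 nm plays the role of 1 m, an object's apparent size in
    'scaled metres' is just its real size expressed in nanometres."""
    return real_size_m * NM_PER_M

hair_km = apparent_size_m(80e-6) / 1000    # an 80 micrometre hair
paper_km = apparent_size_m(100e-6) / 1000  # a 100 micrometre sheet of paper
coin_km = apparent_size_m(1.5e-3) / 1000   # a 1.5 mm coin

print(f"hair : {hair_km:.0f} km")   # 80 km
print(f"paper: {paper_km:.0f} km")  # 100 km
print(f"coin : {coin_km:.0f} km")   # 1500 km, low-Earth-orbit territory
```

The same one-liner works for anything else you want to scale up: multiply the real size in metres by a billion.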

I think it’s fair to say nanotech is still in its infancy, owing to the obvious difficulties in manipulating matter on such a small scale. At this tiny scale, many of the materials we’re used to dealing with have very different and very interesting properties, opening up a range of applications and possibilities. Ever the inquisitive one, I hear you ask again – “What can we do with these tiny items?” Well actually, you may have already used devices and products that incorporate nanotechnology. A few examples are:

 Sunscreen
Yes, sunscreen. Your transparent sunscreen most likely has nano-particles of titanium dioxide and zinc oxide that absorb harmful UV rays.

 Clothing
Nanoparticles are increasingly being added to clothing to offer UV protection, antibacterial action via silver nanoparticles, or nano silica particles for waterproofing. Expect future developments to merge nanotubes and nano fibres into “smart” clothing that can respond to your body, or your immediate environment.

 Computers / Smartphones / Tablets
Yup, those too. The super-fast processors that run your PC, smartphone, etc are manufactured using ultra-small semiconductor components that can be as little as 22 nm across nowadays.

Graphene is an atomic-scale honeycomb lattice made of carbon atoms.


One substance that seems to be causing plenty of excitement in the nanotech world is graphene. Graphene is simply our old friend carbon, the same stuff that gives us charcoal, pencil lead, and the black stuff you have to scrub off the bottom of the pot when you get carried away playing games and burn your dinner. Carbon atoms can be arranged in a variety of ways, with very different results: depending on the configuration of the atoms you can get hard diamond or soft pencil lead, to name just two. Graphene is a hexagonal, two-dimensional sheet arrangement of carbon atoms, only one atom thick. This substance has incredible properties, particularly excellent electrical conductivity, which makes it promising for manufacturing computer chips. Graphene nanoribbons could be capable of transporting electrons thousands of times faster than a traditional metallic conductor, resulting in fast processors and solid-state storage technology that would be a gamer’s dream.

The medical applications of nanotech are shaping up quite well too, with biotelemetry implants the size of a grain of rice that can remain powered (with a graphene-technology battery) for up to a month. Nanotech also allows for effective drug-delivery mechanisms: a nanostructured composition encapsulating a protein called interleukin-2 (IL-2), which is lethal to cancer cells, helps fight cancer more effectively while minimizing the side effects of high dosages of the “naked” IL-2 protein.

Nanotech has the potential to revolutionize a large number of industries. If we can develop better techniques for manipulating matter at this scale, we can expect a myriad of amazing new applications to crop up.

 Right, so I’m off to play Crysis, where nanosuits and other cool hi-tech stuff abound. Let’s hope I don’t burn my dinner.


Google takes mobile customisation into overdrive.


One thing is clear: there is no shortage of innovation at Google. The data giant isn’t satisfied with global domination of the smartphone market through its Android operating system; now they want to standardise and modularise the hardware side of smartphones too.

Enter a fully modular and endlessly customisable smartphone, the Ara. The Ara is basically an exoskeleton frame that lets you plug in different ‘modules’ providing different functionality: the screen, sound, the antenna, the battery, etc. These modules can be designed and built by ANYONE using the open source platform Google is providing for hardware and software developers. Google is planning to implement a Play Store-style system to bring the modules to consumers and, I would assume, to enforce some kind of quality control. Even the modules themselves will be highly customisable, allowing the user to remove and swap the casing for further personalisation.

Project Ara
An Ara mobile disassembled


A modular mobile phone scheme allows for a longer device lifespan: you won’t throw away your whole device if just the screen or battery is malfunctioning; you’ll simply replace the module and go on with your life. The modular phone concept is not new. You might remember Phonebloks, a modular phone Kickstarter project from last year. That project is now being developed in collaboration with Project Ara.

Google says Project Ara is in line with its aim of reaching 6 billion smartphone users. That number probably has you thinking “Google, you’re reaching”, but then again, when have they ever not been? This is one of their ‘moonshot’ initiatives, which include the self-driving car and the global internet coverage balloon network, Project Loon. Speaking at the recent LAUNCH conference in San Francisco, project head Paul Eremenko stated that they are aiming for a 50 USD entry-level unit when the phone finally comes to market in early 2015. That is quite simply mind-blowing. It is also highly disruptive, if it actually takes off and gains traction.

If that does happen, we will see a whole new ecosystem emerge for exciting new startups. One could easily imagine medical and scientific modules being developed that would totally redefine what a smartphone is.


E-waste is a serious problem in Africa, and a growing portion of the e-waste pie is mobile devices. Countries such as Nigeria, Benin and Ghana are being used as dumping grounds for obsolete electronic devices from all around the world. These gadgets, so instrumental to our daily lives, are made up of components such as the processor, display and antenna which, when put together, make a mobile device.


When a device develops a defect, it is usually just a certain piece of hardware that needs replacing, but the cost or feasibility of repairing that particular chip, LCD screen or other malfunctioning part is prohibitive for most people, so they end up throwing their devices away. These end up in huge toxic landfills, and the materials the devices are made of take thousands of years to decay. E-waste is a complex problem, with many stakeholders in the global electronics market needing to take steps towards more sustainable methods of manufacturing. Google’s Project Ara is a definite step in that direction.