All posts by dashinte

The Future, between fiction and science-fiction

This article is an attempt to shed some light on the future of mankind, Artificial Intelligence (AI) and the possible coexistence between us and the machines. I said some light, meaning that these are my opinions and not some heavy stuff coming out of the Holy Bible, the Quran, the Ramayana or the Kama Sutra!

I am not Nostradamus; being born in Transylvania, I am more of a vampire than a seer, so my visions of the future are not very peachy, leaning more toward pessimistic, Armageddon-like scenarios.

History tells us that we human beings never learn from our experience or our mistakes. We started wars with rocks and sticks, continued with arrows and swords, then guns, bombs, A-bombs, chemical weapons, WMDs, biological weapons, you name it, and it is not over! It seems that violence and war are built-in functions of our brains; we endlessly talk about and promote peace, but reality is a three-letter word: WAR!

Since the Golems and Frankenstein, we have had a weird fascination with mechanical entities able to “think”. The science-fiction universe is full of amazingly advanced robots, some of them friendly like C-3PO and R2-D2, others evil like the Terminators. The question raised by many scientists, writers and philosophers is: what will keep the friendly robots friendly after they realize that they are better than us in every domain? And yes, the Three Laws of Robotics created by Isaac Asimov are pure fiction and will not protect us! A wise man, Stephen Hawking, once said: “Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks.”

Nowadays, on top of the “conventional” wars, we are faced with cyber-wars, hacking, billions of dollars stolen electronically, fake news, manipulated elections and the subtle, not directly visible war called social media platforms! All this is possible because of the amazing development of IT and AI. Yes, for the time being there are human beings controlling all the computers behind the cyber-wars, only the tendency to automate and make more supercomputers autonomous is very strong! One example is the quantum computer, apparently already built by several universities and governments and ready to be launched. Oh, those mighty qubits, how much they will change our world!

Faith, religion and God will be the subject of another article. Considering the controversial state of mankind today, I will quote Kurt Vonnegut: “If God were alive today, he’d be an atheist!”

There are many predictions and visions about the future of mankind, made by serious scientists, sci-fi writers, gypsy fortune tellers and religious nuts, but the four main scenarios are:

The Happy Mankind living in harmony with The Machines

The world of this scenario is run by super AI machines that are very friendly towards humans. Yes, something similar to Asimov’s Three Laws of Robotics prevents the machines from harming humans and from trying to take over the Earth.

The AI machines take care of everything: producing food, manufacturing all the needed goods, administration, transportation, housing, health, science, the environment and education. With non-polluting means of transportation and pollution eliminated from almost all industries, nature will be reborn.

Humans, well, humans will be happy and very busy creating art, writing famous books and developing extraordinary scientific theories. Well, some of them anyhow. Guided by the geniuses of mankind, the AI machines will develop cures for all diseases, so people will live happily for hundreds of years.

When the issue of overpopulation arrives, intergalactic flight will be invented and Faster Than Light Speed (FTLS) ships created. The FTLS intergalactic ships will deliver humans to Earth-like planets across the galaxies, planets terraformed beforehand by… the AI machines!

Gradually, the Universe will be populated by humans, or by a new kind of human race that evolved through harmonious interaction with the machines over thousands of years. Gradually, body parts will be replaced by prosthetic bits, the internal organs as well, so the new humans will be cyborgs with huge brains and robot-like bodies. After a while, unfortunately, the organic brain will start to decay and eventually the humans will die, so at some stage humans will decide to transfer their brain content to a cyber-brain. Now mankind is quasi-immortal, but the fundamental question can be asked: are these beings part of mankind or not?

It is possible that the expansion of mankind across the Universe will bring contact with other civilisations. Mankind will interact with some, fight some and ignore some. It will not significantly change mankind’s evolution, simply because of simple arithmetic: the billions and billions of humans across the Universe will not be matched by any other civilisation. In my humble opinion, it will be a very boring Universe!

The Transformation of Mankind into vegetables/batteries for the AI Machines and beyond

This will now be a virtual, Matrix-like world. The planet will be covered with endless “farms” growing humans in a vegetative state.

The bodies will be fed a nutritional liquid via tubes, while an array of electrodes and sensors will collect the energy they create and send it to battery farms.

Some of the bodies will have “Ghost in the Shell”-like dreams or nightmares. Realizing that the “dreamers” produce more energy than the inert ones, the AI farm admins will start inducing dreams into the brains of their “batteries”.

In several hundred years, all the bodies will “dream” similar fantasies, developing a strange, supernatural kind of awareness. The AI supercomputer will have no knowledge of this new manifestation of “mankind”.

In several millennia, planet Earth will host two mega-entities: the AI supercomputer and the elusive Dreamkind.

The AI supercomputer will continue to evolve, becoming more and more intelligent, solving every scientific puzzle on Earth but never developing a spaceship. Why? Because AI lacks the very essence of mankind, the Soul. Without a soul there is no desire and aspiration to reach the stars, to become a space explorer.

The Dreamkind entity will also evolve; in several thousand years, this new manifestation of mankind will break free from the bonds of the “vegetative” bodies, becoming an entity of pure energy, self-sustaining and able to move through space and time.

The Dreamkind, although capable of it, will not seek revenge against the AI machine, because the Dreamkind inherited the collective soul of mankind, and compassion and forgiveness are components of the super-soul. Instead, the Dreamkind will decide to move forward, leave the Earth in the material sense and accept the invitation to join the Universal Holy Spirit, also known in our time as God. If you ask me whether I believe in God, I will paraphrase a famous philosopher who answered the question with: “Yes, sometimes during the night!”

The Idiocracy of Mankind

Do you know how many policemen are needed to screw a light bulb into the ceiling socket? Five: one to jump on a table and hold the bulb in the socket and the other four to rotate the table! It is just a joke; I have the utmost respect for the coppers and their intelligence.

Now, imagine planet Earth in a not very distant future, populated by billions of humans with an average IQ just above 20. It will be a world of morons and moroness! I know that the word moroness is made up, but if we have a baron and a baroness, why not a moron and a moroness! In old medical terminology, a moron has an IQ between 50 and 65 and is labelled a person with mild intellectual disability, so with an IQ between 20 and 30 it will be a planet of super-morons!

How can humanity end up like this? Several factors contribute: increased pollution, climate change and a stagnation of technology and AI development. Everything starts to degrade: manufacturing, transportation, the health system, governments. In several hundred years, the technology is dying and there are no specialists to fix it. Education is virtually non-existent; almost all the schools are closed. The only thriving industry is porn cinematography, and the results are disastrous. People will watch endless porn movies, have sex and have lots and lots of children. The children will be more and more stupid with each generation. After a while, the social infrastructure will disappear and anarchy and chaos will take over. All the technology will stop working, everything reverted to Stone Age-like conditions. With no rules and no rulers, the only law is the survival of the strongest.

In just a few hundred years, mankind will become a primitive community: languages reduced to about two hundred words, no technology, some tools and fire but nothing else. The involution of mankind is complete. This, in a nutshell, is the Gaussian-distribution model of mankind’s evolution and involution.

Far from being my favourite scenario, this one is unfortunately the most probable. Look around us: the infant phase of idiocracy is already upon us. The porn industry is booming, the most visited sites on the Internet are porn sites, fewer and fewer people are reading books, everybody is on some social media platform posting stupid photos of their body parts, and singers and football players are paid millions of dollars while Nobel prize winners get peanuts!

The beginning of the above scenario is very well illustrated in the movie IDIOCRACY.

( https://en.wikipedia.org/wiki/Idiocracy)

Pollution, the Killer of Mankind

The rapid increase in pollution (industrial and domestic) will irreversibly degrade the global ecosystem by 2060, bringing about the end of civilisation.

A 3°C increase in the average temperature by 2060 will generate lethal heat waves lasting over three weeks, affecting 35% of the planet’s surface and about 55% of the global population.

The most affected ecosystems will be the Amazonian rainforest, the Great Barrier Reef and the Arctic zone, which will disappear very quickly.

A catastrophic effect will be the disappearance of several insect species, which will dramatically affect the food chain; the huge increase in food prices will make food prohibitive for most of the population.

A massive drinking water crisis will again affect most of the Earth’s population. Obviously, the highest levels of pollution will occur near industrial estates, power plants and highways, exactly the areas where the poor population lives, so they will die first.

The bad news for the rich people of the planet is that they will be next, surviving a few months or a few years longer than the poor, but when their stores of food, medicine and energy run dry, they will face extinction as well.

If you ask where AI is in this scenario, the answer is: which AI? Technology, IT and AI will disappear together with the meltdown of society.

So, in this scenario, mankind alone is guilty of its own destruction and nothing will be able to save us.

Planet Earth passed by another planet and said: I am a bit worried, I have humans! The other planet replied: It is OK, it will pass!

In conclusion, I am not an optimist, so my stories do not have a happy ending. So, as the great charlatan Nostradamus said: “In the city of God, there will be a great thunder. Two brothers torn apart by Chaos while the fortress endures. The great leader will succumb. The third big war will begin when the big city is burning.”

DATA CENTRE TRANSFORMATION: A CASE STUDY

The exponential evolution of IT hardware platforms, operating systems and applications has created a quasi-perpetual data centre transformation trend.

Every company, regardless of size, is undergoing the data centre transformation process, from hardware and application refreshes to virtualization and cloud migration. If your company has migrated its IT infrastructure to the cloud, the burden of hardware refreshes and upgrades will be carried by the cloud provider; your IT department will only look after application upgrades.

This case study refers to a typical medium-sized company that requested a Data Centre Transformation (DCT) project, with migration to new data centres and an assessment of what kind of transformation would suit their IT infrastructure at the new data centres.

The approach was a typical enterprise architecture one (loosely following the TOGAF framework).

We organized the first workshop with the client’s management, stakeholders and IT management and technical people.

We learned that the customer would take care of the business transformation and also of the data/application migration and transformation. Our role was to perform the IT infrastructure transformation and migration.

The subsequent workshops were dedicated to understanding the business needs, directions and growth for the near and distant future, and also to gaining a similar understanding of the company’s data structure and growth, its data protection and archiving needs, the applications matrix, future requirements, virtualization and cloud compatibility, and mobility requirements.

From these workshops we learned that the business would grow on average 5% per annum and the data at a rate of 10% per annum; that a dedicated backup and archiving solution was required (we offered a leveraged secure backup solution, but it was not accepted); that the applications would not be transformed prior to or during migration, but virtualized after a successful migration project; and, most important, that the customer required zero downtime for its Production environment during migration, and this requirement was not negotiable!
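Growth figures like these lend themselves to a quick back-of-envelope capacity projection. The sketch below is illustrative only: the 5% and 10% annual rates come from the workshops, while the starting figure of 50 TB and the projection horizon are assumptions invented for the example.

```python
def project_growth(initial, annual_rate, years):
    """Compound an initial figure by a fixed annual growth rate."""
    return initial * (1 + annual_rate) ** years

# Hypothetical: 50 TB of data today, growing at the 10% p.a. from the workshops.
data_tb_now = 50
for year in (1, 3, 5):
    print(f"Year {year}: {project_growth(data_tb_now, 0.10, year):.1f} TB")
```

A projection like this is what lets you size the new SAN and servers for the life of the data centre rather than for day one.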

The team tasked with the IT hardware platform migration consisted of a Project Manager (PM), an Enterprise/Solution Architect (SA), a Business Analyst (BA), a Network Engineer (NE), two Windows/VMware Engineers (WVE), a Storage SME (SE) and a Backup SME (BS).

We started with a Discovery phase, auditing the Prod and DR data centres and recording every piece of equipment: compute, storage, network and security.

The results of the Discovery phase were presented in a report and a workshop with all the relevant stakeholders. The customer’s IT infrastructure consisted of two data centres: a live Prod one and a cold DR one. The Prod infrastructure had new servers with growth capacity, but needed a new SAN and a refresh of the core switches and encryptors. The Prod environment was set up with redundancy in the Prod data centre; each server had an equivalent server on standby.

The cold DR data centre was outdated, with old servers and network equipment. The failover process was fully manual and very slow, totally unacceptable for the business.

The solution for the customer DCT was the following:

  • Migrate the cold DR infrastructure from the existing cold DR data centre to a new DR data centre and transform it into a live DR data centre, achieving an active-active data centre scenario. The new active DR data centre will be fitted with new servers and network equipment (similar to or newer than the Prod environment), and the failover process will be 90% automated and scripted, with minimal manual input (only for the core applications requiring multi-level authentication).
  • In order to achieve the zero-downtime requirement for the Prod data centre migration/transformation, a staged process was designed. First stage: fail half of the Prod applications over from the old Prod data centre to the new active DR data centre, and migrate the infrastructure belonging to those applications to the new Prod data centre; after migration, perform another failover, from the DR data centre to the new Prod data centre. Second stage: fail the remaining applications over from the old Prod data centre to the new DR data centre, migrate the remaining infrastructure from the old Prod data centre to the new Prod data centre, and perform another failover from the DR data centre to the new Prod data centre.
  • The design catered for application interdependencies; the staged migration of the Prod data centre was based on moving dependent applications during the same stage.
  • The design provided a new solution for the Prod network infrastructure, with new core switches and encryptors, and also a redundant Internet connection sourced from three major telcos.
  • The design also provided a new backup solution, with new backup servers and a dedicated backup network.
  • Although it was not part of the migration/transformation solution, the customer requested a solution design for the virtualization of their Prod and DR environments, a new and more performant SAN, and a high-level design for a possible cloud migration, which we provided.
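The staged, zero-downtime migration pattern above can be sketched as an ordered runbook generator. This is a simplified model: the application names and the two-group split are invented for illustration, and a real runbook would add verification and rollback steps between each failover.

```python
def staged_migration_plan(stage_groups):
    """Generate an ordered runbook for a staged, zero-downtime migration.

    For each stage: fail the group's apps over to the live DR data centre,
    migrate their infrastructure to the new Prod data centre, then fail
    back from DR to the new Prod data centre.
    """
    steps = []
    for stage, apps in enumerate(stage_groups, start=1):
        group = ", ".join(apps)
        steps.append(f"Stage {stage}: fail over [{group}] from old Prod to live DR")
        steps.append(f"Stage {stage}: migrate [{group}] infrastructure to new Prod")
        steps.append(f"Stage {stage}: fail over [{group}] from DR to new Prod")
    return steps

# Hypothetical two-stage split that keeps interdependent apps together.
for step in staged_migration_plan([["ERP", "CRM"], ["Mail", "Intranet"]]):
    print(step)
```

The point of the structure is that at every step the applications are live somewhere, either in the DR site or in a Prod site, which is how the zero-downtime requirement is met.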

The migration/transformation of our customer’s IT infrastructure was a success; the main requirement of zero downtime during migration was achieved, together with a streamlined Prod environment ready for virtualization and cloud migration.

If you want to find out more about DCT projects and maybe want some help with your migration/transformation contact me: ilie.mihut@dashinternational.com.au

GREED AND STUPIDITY = EXTINCTION

The title is a loose quote from an article by Stephen Hawking, who for several years now has been foreseeing the extinction of mankind, caused by greed and stupidity.

Apparently, there is no connection between greed and stupidity, but actually they are closely interconnected.

Statistically, a small percentage of the human population owns ninety per cent of the Earth’s wealth. I am not assuming that those very rich people are greedy; it is just that the stats show that the line from a Midnight Oil song, “The rich get richer and the poor get the picture”, is actually true.

Again, statistics show that “today, nearly 17% of the world’s adult population is still not literate; two thirds of them women, making gender equality even harder to achieve.

The scale of illiteracy among youth also represents an enormous challenge; an estimated 122 million youth globally are illiterate, of which young women represent 60.7%.

The 67.4 million children who are out of school are likely to encounter great difficulties in the future, as deficient or non-existent basic education is the root cause of illiteracy.

With some 775 million adults lacking minimum literacy skills, literacy for all thus remains elusive.” The quote is taken from: http://www.unesco.org/new/en/education/themes/education-building-blocks/literacy/resources/statistics

So, the very deep connection between greed and stupidity is revealed by the statistics above: if greed will not help educate the hundreds of millions of human beings who are illiterate, stupidity will win and the human species is on a clear path to extinction.

Present-day technology is able to mass-produce extremely cheap PCs, laptops and smartphones, and the super-IT and social media companies like Facebook, HP and Oracle can afford to provide free Internet on a global scale. This scenario, with almost free hardware and free Internet, is crucial in preventing the planet from gradually being populated by stupider and stupider people. IT can help educate people, giving them the elementary tools for learning and avoiding the involution of the human species.

There are examples of laptops and mobile phones produced in India and sold for a few dollars apiece, and examples like Facebook being willing to provide free Internet in some parts of the world, so there is hope that the course towards stupidity and extinction can be avoided, but it is not enough.

I know that there is not much we can do about greed, but the illiteracy problem can be solved at a planetary level via the United Nations and the rich Western nations. It is naïve to believe that tomorrow all the conflicts and wars will suddenly stop and the billions of dollars spent on wars will be used to fight illiteracy and stupidity, but I strongly believe that if we do not try, our future is doomed.

There is an old joke about the world wars: a wise man was asked what weapons will be used in the third world war, and his answer was: I do not know, but the next war after that will be fought with rocks and sticks!

FROM FRANKENSTEIN TO NESTOR (NS-5) OR FROM THE MODERN PROMETHEUS TO ARTIFICIAL INTELLIGENCE (AI)

The creation of a sentient being, biological like Frankenstein or electronic-positronic like Nestor, the NS-5 robot from the novel “I, Robot” by Isaac Asimov, was and still is a fascinating subject.

The above examples are fiction, created by the imagination of some very talented writers.

Today, Artificial Intelligence (AI) is a reality, not in a 100% sentient sense, but it is getting there. Serious research is being done by the major IT companies, some governments and some military research institutes.

From a technology point of view, AI devices (stationary or mobile) will be a blessing; all these inventions like smart cars, smart appliances, smart buildings and smart cities will be exponentially upgraded to super-smart, semi-sentient devices, which theoretically will make our lives better, safer and longer.

It is easy to imagine, in the not so distant future, being permanently connected to the AI hub of your household, with everything you need being delivered the way you want it, and at the moment you want it, because the AI robot will learn your needs and habits and seamlessly adapt to fulfill them. It is also easy to imagine those AI units manufacturing everything for us, and manufacturing other AI units sooner rather than later.

With that last sentence I just opened the Pandora’s box of the AI challenge, because the inevitable questions will come to mind: what is going to happen when the AI units become more intelligent than us, and, more, what is going to happen when the AI units become fully sentient robots?

Well, in sci-fi literature and movies, we have the Three Laws of Robotics created by Isaac Asimov, which are:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

The problem with these “beautiful”, fail-safe-for-humanity laws is that they are not real, and we do not have the technology or mechanisms to implement them. There are many Doomsday scenarios regarding future wars between us and the machines, and in almost all of them we lose. The reality is that there is an unseen, unadvertised race to create pre-sentient AI, and I am afraid that the fail-safe mechanism is being neglected. Today, several prominent scientists are trying to raise the alarm, including Stephen Hawking, and I am not sure that the hidden powers behind the AI race are willing to listen.

Now, let us assume that the sentient AI machines will not kill us, will help us and will be our friends and servants forever. This scenario is actually much scarier than the first one, because it will gradually drive us towards a rapid involution of humanity. Just think that we will not need to work, drive or create, yes, create, because the super-intelligent AI machines will do all of the above for us.

What will be left for mankind? Watch TV, eat, drink (all of it free!), have sex, procreate and start the idiocracy process. Our lifespan is getting longer, so the planet will have more and more old people, but they will be healthier old people, capable of working in a huge variety of domains. If the AI robots replace the human workforce, not only will the old people not have a job, but the majority of the population will be in the same situation.

I believe that the process of protecting mankind’s intelligence, creativity and imagination should start now. No, I do not have the answer for how to proceed with this enormous task, but I know that right now there are enough smart and talented people able to find a solution.

I hope that our grandchildren and their grandchildren will have an active life, using their brains at full capacity, creating and inventing things, writing novels and poems, travelling to other worlds and galaxies, and not becoming vegetables with pea-sized brains and a huge sex drive.

INFORMATION AND COMMUNICATION TECHNOLOGY IS NOT A BIG PHILOSOPHY!

Sigmund Freud, the Austrian neurologist, most famous for loving his mother very much, also identified three revolutions of mankind:

  • The first was the Copernican revolution, when humanity understood, via astronomical observations, that Earth is not the center of the Solar System and we are not the Centre of the Universe.
  • The second was the Darwinian revolution, when humanity realized that we are descendants of other animal species, inferior to us but not very different in many aspects. I should also mention the Creationist anti-revolution and the myth of the alien origin of mankind.
  • The third was the Freudian revolution, when humanity discovered via Freud and his disciples, the unconscious motivation, and the damning realization that our minds are not transparent to us.

The philosopher Luciano Floridi came up with the concept of a new revolution, the information-digital one, and he claims that mankind is at the beginning of this fourth revolution. We are immersed more and more in the infosphere (the environment of information and communication surrounding us).

For the young generations, born in the digital era, the speed of information and communication evolution seems normal. And it is not only mobile phones, tablets and Virtual Reality (VR); it is also smart buildings and smart cars, new educational opportunities like MOOCs (Massive Open Online Courses), social media platforms and many more.

The older generations, born “non-digital”, are amazed and sometimes overwhelmed by this digital revolution; the transition from a non-computer era, with no Internet and no mobile phones (to mention just a few), to today’s digital wonders is not an easy one. We, the older generation, used to have printed books; we used to read novels and poetry, we used to enjoy theater plays and jazz concerts, we had real friends, not virtual ones, and we used to communicate directly with other human beings or write a letter, on a piece of paper!

The infosphere surrounding us is growing at an amazing speed and becoming more complex every moment, and it is impossible to predict its long-term effects on mankind, on human social behavior and on our society. Paradoxically, the social media platforms where you have thousands of virtual friends, where you can share information, memories, photo albums and music, are drifting us away from direct human relationships; these social platforms are becoming more and more anti-social.

Do not get me wrong, I truly believe that mankind is engaged on a great journey toward a fantastic future, where new technologies will make our lives better, easier and we will have more time to enjoy what we like and enjoy our hobbies.

My concern is about the effects of the fourth revolution on the discovery of the third one, the unconscious mind. Surrounded by a super-complex infosphere, bombarded every second with torrents of digital information, how is our unconscious mind going to react? Dr Jekyll or Mr Hyde, who is going to rule us in the end?

But the scary moment will be when your laptop tells you: “Cogito ergo sum!” And that will be the next revolution! Welcome to the Machine!

Contact me at: ilie.mihut@dashinternational.com.au

BIG DATA in small words

I have to admit that for me, the name Big Data sounds somehow childish. It is as if you, a very intelligent and highly educated IT consultant, were asking your six-year-old son: ‘Hey Bill, daddy is working with lots of unstructured, huge data sets and we need a name for it.’ Bill: ‘Aaaaaa… Big Data?’

A simple definition of Big Data is: very large sets of unstructured data, with sizes beyond the ability of commonly used software tools to capture, manage and process in a tolerable time frame, in order to enable enhanced decision making, discovery and process optimization. The size of these data sets is constantly increasing, from a few terabytes at the beginning of this millennium, to many petabytes today and many exabytes tomorrow [petabyte (PB) = 10^15 bytes, exabyte (EB) = 10^18 bytes].

In 2001, Gartner Inc. (then META Group) defined the 3Vs of Big Data (volume, velocity and variety), adding the fourth V (veracity) later:

  • Volume – the amount of data
  • Velocity – in and out speed of data
  • Variety – the range of data types and sources
  • Veracity – the quality of the data


The next step, after acknowledging the inability of conventional software to process Big Data, was to develop software and tools able to solve this problem. Seisint Inc. developed a C++-based distributed file-sharing framework for data storage and query, followed in later years by MapReduce and Hadoop, with more advanced approaches to Big Data processing.

In order to set up a Big Data processing environment, one will need:

  • A serious number of host machines (nodes) organized in a special cluster. The nodes can be partitioned into racks.
  • A highly performant storage array of reasonable size
  • A software framework with three main components:
  1. The framework providing the computational resources (CPU, memory, etc.) needed for application execution. Hadoop uses YARN (Yet Another Resource Negotiator) for this task.
  2. The framework providing permanent, reliable and distributed storage. Hadoop uses HDFS Federation (Hadoop Distributed File System); Amazon uses S3 (Simple Storage Service).
  3. The MapReduce framework, which is the software layer implementing the MapReduce paradigm. In layman’s terms, MapReduce was designed to take big data and use parallel distributed computing to turn it into regular-sized data, by mapping the data and then reducing it.
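In that spirit, the classic word-count example can be sketched in plain Python. This only illustrates the map, shuffle and reduce phases conceptually; it is not how Hadoop actually distributes the work across nodes, where the shuffle is done by the framework between machines.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Shuffle: group all values by key (handled by the framework in Hadoop)."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data is big", "small words for big data"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])  # "big" appears three times across the two documents
```

Because each map call and each reduce call is independent, both phases can be spread over thousands of nodes, which is exactly the property Hadoop exploits.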

For more details about the MapReduce paradigm read the article:

(http://www.dummies.com/how-to/content/the-mapreduce-programming-paradigm.html)

The evolution of Big Data processing ecosystems has triggered the emergence of other non-conventional techniques, like the NoSQL technologies. A NoSQL (“non-SQL” or “non-relational”) database provides a mechanism for storage and retrieval of data which is modeled in means other than the tabular relations used in relational databases (Wikipedia).
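To make the contrast with tabular relations concrete, here is a toy in-memory document store in pure Python. It only illustrates the schema-free, key-based access pattern; real NoSQL databases (MongoDB, DynamoDB, etc.) add persistence, distribution and indexing on top, and the class and field names here are invented for the example.

```python
class ToyDocumentStore:
    """A minimal in-memory document store: no schema, no tables, no joins."""

    def __init__(self):
        self._docs = {}  # document id -> arbitrary dict

    def put(self, doc_id, document):
        self._docs[doc_id] = document

    def get(self, doc_id):
        return self._docs.get(doc_id)

    def find(self, **criteria):
        """Return documents whose fields match all the given criteria."""
        return [d for d in self._docs.values()
                if all(d.get(k) == v for k, v in criteria.items())]

store = ToyDocumentStore()
# Documents in the same store need not share a schema.
store.put("u1", {"name": "Ada", "role": "engineer"})
store.put("u2", {"name": "Bob", "role": "engineer", "city": "Sydney"})
print(store.find(role="engineer"))
```

Note that the two documents have different fields, something a relational table would not allow without schema changes; that flexibility is the main trade-off NoSQL makes against SQL’s rigid guarantees.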

Another interesting development was the evolution of Apache Spark from a component of Hadoop into a fast, general engine for large-scale data processing in its own right. Apache Spark can run in standalone cluster mode, on EC2, on Hadoop YARN, or on Apache Mesos.

With the very fast evolution of the Internet of Things (IoT), the variety V of Big Data is amazing. The sources can be any smart device: smart cars, smart cities, satellites, traffic cameras, surveillance cameras, ATMs, etc. The data is collected in every known format, plus several new formats every day, and that points to the real challenge with Big Data, which is the first V (volume); the growth rate is incredible!

I am, generally speaking, an optimist, and I believe that the future will bring us fantastic ways of processing the Super-Big Data of the future, which can only be described as being “as big as China”!

For details about this topic, contact me at: ilie.mihut@dashinternational.com.au

 

THE ANATOMY OF AN IT INFRASTRUCTURE DISCOVERY


“The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge.” Daniel J. Boorstin

Business process discovery and IT process discovery are far less famous than the most important discoveries of mankind, but I want to highlight that the last part of the quote above, the illusion of knowledge, applies perfectly to business and IT process discovery.

The IT infrastructure discovery is one out of many IT discoveries, which include: IT processes, applications, data and infrastructure.

Paradoxically, the IT infrastructure is the Cinderella of the IT domain and at the same time the only “real”, tangible part of it, because regardless of where your infrastructure resides (in a data center or a cloud), it is ultimately made of physical servers, storage arrays, switches and routers.

The infrastructure discovery is usually done together with data and applications discovery; sometimes it is a stand-alone process if, for example, the infrastructure is obsolete and a refresh process is needed.

Experience will tell you that a lot of IT departments do not have an up-to-date infrastructure inventory database, so the best practice is to acknowledge the customer’s asset database and then run the discovery process from scratch.

Nowadays, there are many infrastructure asset-audit applications, lots of them with agentless deployment, scanning the infrastructure (servers, virtual machines, storage arrays, switches, routers, load balancers, etc.) via almost all the IT protocols available (network, storage, compute, etc.). One has to input the IP address ranges and the admin credentials for the scanned domains, and the audit application will collect all the available information for every asset. It is also recommended to run the software audit on the users’ subnets, collecting the audit data for the PCs, laptops, printers, scanners and mobile devices as well.
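The first step of such a scan can be sketched in a few lines. This is a toy reachability sweep only: it probes a handful of TCP ports across an IP range, whereas real audit tools then interrogate each responding asset over SNMP, WMI, SSH, IPMI and the storage protocols. The port list is an assumption for illustration:

```python
import ipaddress
import socket

def sweep(cidr, ports=(22, 80, 443), timeout=0.2):
    """Probe every host in the given range for common management ports
    and return {ip: [open ports]} for the hosts that responded."""
    found = {}
    for host in ipaddress.ip_network(cidr).hosts():
        open_ports = []
        for port in ports:
            with socket.socket() as s:       # TCP, IPv4 by default
                s.settimeout(timeout)
                # connect_ex returns 0 when the port accepted the connection
                if s.connect_ex((str(host), port)) == 0:
                    open_ports.append(port)
        if open_ports:
            found[str(host)] = open_ports
    return found
```

A real discovery run would feed the resulting IP list into the credentialed, per-protocol collection phase described above.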

Again, experience will tell you that the software audit results will cover only 90-95% of the assets, so a physical audit is highly recommended. With the help of several SMEs, the physical audit will cover every asset in every cabinet, recording the information for each asset from the front and the back of the cabinet: asset name, serial number, network and fiber connections, etc.
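Combining the two data sets is essentially a merge keyed on the serial number, flagging the assets the scanner missed. A minimal sketch, with invented field names:

```python
def merge_audits(software, physical):
    """Combine software-audit and physical-audit records (both keyed by
    serial number) into one report, flagging assets the scan missed."""
    report = {}
    for serial in set(software) | set(physical):
        # Physical-audit fields win on conflict: the rack is ground truth
        record = {**software.get(serial, {}), **physical.get(serial, {})}
        record["found_by_scan"] = serial in software
        report[serial] = record
    return report

software = {"SN1": {"name": "srv1", "os": "linux"}}
physical = {"SN1": {"rack": "A1"}, "SN2": {"rack": "A2"}}
report = merge_audits(software, physical)
```

Here SN2 would surface as a physically present asset invisible to the scanner, exactly the 5-10% gap mentioned above.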

The combined results of the software audit and the physical audit will be used to create an artefact called the discovery report, which will have a detailed diagram of each cabinet (front and rear) and tables with assets: servers, storage arrays, network equipment. The tables will have details about the make, model, serial number, name, compute power and manufacture year for each asset. If a proper naming convention was used, the names of the assets will show information regarding the operating system, the applications running on that server and whether the server belongs to the Prod, Dev, Test or DR environments.
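When such a convention exists, extracting that information is one regular expression. The pattern below encodes a hypothetical convention (site, environment, application, sequence, e.g. SYDPRDORA01); real conventions vary per organization:

```python
import re

# Hypothetical naming convention: <site:3><env><app:3><seq:2>
# e.g. SYDPRDORA01 = Sydney / Prod / Oracle / node 01
PATTERN = re.compile(
    r"^(?P<site>[A-Z]{3})(?P<env>PRD|DEV|TST|DR)(?P<app>[A-Z]{3})(?P<seq>\d{2})$"
)

def parse_name(name):
    """Decode an asset name into its components, or None if it
    does not follow the convention."""
    m = PATTERN.match(name.upper())
    return m.groupdict() if m else None

print(parse_name("sydprdora01"))
# {'site': 'SYD', 'env': 'PRD', 'app': 'ORA', 'seq': '01'}
```

Names that return None are worth a closer look during the physical audit: they are often exactly the forgotten, undocumented machines.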

One will be surprised to learn that sometimes a Prod server is actually a desktop PC under somebody’s desk, built many years ago as a test machine and forgotten by everybody!

The discovery report will present a clear and accurate view of the present state of the infrastructure without any recommendations regarding the future state.  The infrastructure discovery report will play a very important role for the next processes, the data discovery and the applications discovery which will be presented in a future posting.

Do virtualization and the cloud mean the end of the infrastructure discovery process?

No, because ultimately the virtual machines run on physical servers and physical storage arrays, and this is also true for virtual machines running in the cloud. If you are running your applications on a multi-tenant cloud, the physical audit will not concern you per se; it will be the cloud provider’s job, because there is nothing worse than providing a cloud solution which is running on old hardware!

For details about the infrastructure discovery process contact me:

Ilie.mihut@dashinternational.com.au

BACKUP IS DEAD, OR IS IT?

 


No, I will not pretend that the backup is dead! It is just becoming something else.

The old days:

Traditionally, a backup solution involved some kind of media (tapes, disks) to which you copied the important Prod data, then sent the media for safe-keeping in an off-site vault. The restore process, the reverse of the backup process, meant recalling a tape or a set of tapes and restoring the data to the original place or an alternate restore point. Restoring large amounts of data was not a quick procedure: several hours if you were lucky. It is one of the reasons application owners and DBAs kept several sets of Prod data on the Prod storage, just in case!

Present days:

The amount of data has grown exponentially in the last five years and, thankfully, the technology has advanced in a similar fashion. The backup concept is gradually being replaced with the total data protection and management concept, meaning that the data is analyzed once to determine which data to back up, which to archive and what amount of indexing is needed for search purposes. The majority of storage arrays have snapshot capabilities, and with an intelligent backup solution integrating and controlling the snapshot process, backup and restore are much faster and require less space on the arrays. Synthetic backups and the “incremental forever” method, together with deduplication and global deduplication, massively reduce the size of the backups as well.
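The core idea behind deduplication fits in a few lines: split the data into chunks, address each chunk by its hash, and store identical chunks only once. A toy content-addressed store, far simpler than a production engine (which uses variable-size chunking and persistent indexes):

```python
import hashlib

class DedupStore:
    """Toy content-addressed chunk store: identical chunks stored once."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}  # sha256 hex digest -> chunk bytes

    def backup(self, data):
        """Store each unique chunk once and return the ordered list of
        chunk hashes (the 'recipe' needed to restore the data)."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # no-op if already stored
            recipe.append(digest)
        return recipe

    def restore(self, recipe):
        """Reassemble the original data from its chunk hashes."""
        return b"".join(self.chunks[digest] for digest in recipe)
```

Back up the same data twice and the store grows by zero bytes: only the small recipe is new, which is also the essence of the “incremental forever” method.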

On top of this, the backup to the cloud as a service is offered by all major cloud providers, although if your backup is a few hundred terabytes in size, the bandwidth between the Prod site and the cloud can be prohibitive.

Based on the above facts, an enterprise backup solution (when properly implemented) is a robust and very dynamic data protection and management solution: very fast backup and restore Service Level Agreements (SLAs), load-based variable backup windows, elimination of duplicated data at an enterprise/global level, and an archiving and indexing component smartly allowing the archiving of legal documents and messages for the required 7-10-year period. Because remember: backup is for operational recovery, not long-term retention of data!


Future days:

The logical next step in data protection will be the Data Protection Architecture concept, allowing data centers to deliver data protection as a service, with the flexibility to plug in different modules at the appropriate time to address specific user or application requirements. With a data protection architecture there will be no lock-in to a single application or storage device, but a mixture of various data protection tools, from stand-alone software to storage systems and dedicated backup appliances, used as a service and managed as a single entity.

A Data Protection Architecture solution should be flexible, with data protection utilities able to write directly to storage; it should keep the data in its native format and have a governing body to make sure all the required components work together to simplify management. The goal will be to eliminate the user interface from the individual data protection tools and applications and let them embed into the central data protection architecture framework.
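The plug-in idea can be sketched as a small registry: modules register behind one interface and the framework, not the user, routes the data through them. The class and method names below are invented for illustration, not from any real product:

```python
class DataProtectionFramework:
    """Sketch of a pluggable data-protection service: backup, archive,
    indexing, etc. register as modules and are managed as one entity."""

    def __init__(self):
        self._modules = {}

    def register(self, name, module):
        """Plug in a module: any callable that accepts a dataset."""
        self._modules[name] = module

    def protect(self, dataset):
        """Route the dataset through every registered module and
        collect each module's result under its name."""
        return {name: module(dataset) for name, module in self._modules.items()}

fw = DataProtectionFramework()
fw.register("backup", lambda data: f"backed up {len(data)} records")
fw.register("archive", lambda data: f"archived {len(data)} records")
result = fw.protect(["rec1", "rec2"])
```

Swapping a backup appliance for a different one then means re-registering one module; nothing upstream changes, which is exactly the no-lock-in property described above.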

As stated at the beginning of this article, backup is not dead and will not be for a long time, it is just that backup as we know it is evolving into a data protection architecture.

 

A HIGH LEVEL INTRODUCTION TO THE DARK SIDE…. OF THE INTERNET

 


As law-abiding Internet users, we generally stay on the “legal-official” side of the Internet (ClearNet). But I always believed that knowledge is power, so knowing a bit about the Dark Side of the Internet is going to benefit you: you will be aware of the traps and dangers and better prepared to avoid them. Ultimately, Darth Vader is just Yoda with a black helmet and a different funny voice!

So, we are aware of the Visible Net or the Surface Net or ClearNet and we are using it every day:

  • Search engines like Google, Bing, Yahoo, Baidu, Dogpile, HotBot, Metacrawler, etc.
  • Social media platforms like Facebook, Twitter, LinkedIn, Google+, YouTube, Blab, hi5, Friendster, Meerkat, MyLife, Periscope, Plaxo, Xing, Flickr, iTunes, MySpace, Vimeo, Instagram, Pinterest, Reddit, Scribd, SlideShare, Wikipedia, etc.
  • Email services like Gmail, Outlook, Yahoo Mail, AOL Mail, Zoho Mail, Mail.com, Yandex Mail, Inbox.com, etc.

It is time to introduce the Darknet, the Dark Web and the Deep web (definitions from Wikipedia):

  • Darknet: an overlay network, only accessible with specific authorization, configurations and software, generally using non-standard communication protocols and ports.
  • Dark web: the content on Darknets and on overlay networks that use the public Internet but require specific authorization, configurations and software for access.
  • Deep web: the part of the World Wide Web whose content is not indexed by search engines.


Figure 1 Darknet access and components

 

The above diagram is a high level presentation of the Darknet components and some of the access ways.

There are several programs used to get to the Darknet. I will mention two of them: I2p and Tor.

  • I2P, or ‘The Invisible Internet Project’, is an anonymous peer-to-peer network. It allows users to send data between computers running I2P with end-to-end encryption, using unidirectional tunnels and layered encryption. Because of the limited number of outproxies to the Internet, I2P is best for peer-to-peer file sharing.
  • Tor, or ‘The Onion Router’, is an anonymous Internet proxy directing traffic through a worldwide volunteer network of thousands of relays. Tor wraps messages in encrypted layers and sends them through a bi-directional circuit of relays through the Tor network. Tor also provides a central directory to manage the view of the network. Because of the issue of trusting exit nodes, Tor is best for anonymous outproxying to the Internet.
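The layered (“onion”) wrapping both systems rely on can be illustrated in a few lines. Note the loud caveat: base64 below is mere encoding, not encryption; a real circuit encrypts each layer with a key negotiated with that relay. This toy only shows the structure, where each hop can peel exactly one layer and sees only the next hop:

```python
import base64

def wrap(message, relays):
    """Toy onion: add one layer per relay, the first relay's layer
    ending up outermost. NOT cryptography -- structure only."""
    payload = message.encode()
    for relay in reversed(relays):
        payload = base64.b64encode(relay.encode() + b"|" + payload)
    return payload

def unwrap(payload, relays):
    """Peel one layer per relay, in forwarding order."""
    for relay in relays:
        hop, _, payload = base64.b64decode(payload).partition(b"|")
        if hop != relay.encode():
            raise ValueError("unexpected relay in circuit")
    return payload.decode()

circuit = ["guard", "middle", "exit"]
onion = wrap("hello onion", circuit)
print(unwrap(onion, circuit))  # hello onion
```

With real per-hop encryption, the guard relay learns only who sent the message, the exit relay only its destination, and no single relay ever sees both.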

The Darknet Market Places and the Darknet/Clearnet Market Places (several mentioned in the diagram above) are web sites where illicit activities take place: trading, buying and selling any type of goods and digital items, paid for with bitcoins or other kinds of cryptocurrency: drugs, guns, information, child pornography, assassins, malware, ransomware, DDoS, security and anti-security code, access to government sites, LinkedIn accounts and passwords, etc. The existence of many of these sites is ephemeral, not because the government agencies take them down, but because of competition and the fight between various groups of “dealers”.

It is common sense that accessing the Darknet Market Places is a dangerous thing to do, but one is free to choose this path at his/her own risk.

If one really wants to find these places, one can start with Reddit, DeepDotWeb, TheHiddenWiki.org or DNstats.net and look for lists of hidden services or .onions.

There are several Darknet search engines. Two popular ones are Torch (http://xmh57jrzrnw6insl.onion/) and Grams (http://grams7enufi7jmdl.onion/) and they will perform Google like functions on the Darknet.

As mentioned at the beginning of this article, knowledge is power, it is good to know about the Darknet in order to be able to protect yourself, but please do not be seduced by the Dark Side!

P.S. Below are links to a fantastic blog presenting the Darknet (I used information from their blog in the article) and three Darknet related articles for a more detailed view if interested.

(https://blog.radware.com)


(http://fossbytes.com/welcome-to-the-darknet-the-underground-for-the-underground/)


(https://privacyliving.com/2016/02/12/darknet-dark-web-the-tor-browser/)


(http://www.golkondanews.com/the-evil-darknet/)

 

THE INFRASTRUCTURE ARCHITECTURE CHALLENGE


 

 

Very often, the real-life scenario for a business transformation starts with the infrastructure architecture, because the management (CEO, CIO, CFO) has already chosen the future business model and the Infrastructure Architect has to come up with the best fit to match the management “model”. But wait, it gets worse, because in real-life scenarios the CIO or the IT Director will have an infrastructure architecture “solution”, and you, the Infrastructure Architect, have to make it happen!

It is a compromise: you have to come up with the best possible design, match the management requirements and stay on budget. The only possible variation from this solution is when you find some technical detail which can be a show-stopper, like hardware incompatibility or application incompatibility with specific platforms, etc.

Now, let’s set aside the frustration created by the above-mentioned issues and concentrate on the Infrastructure Architecture. Also known as the Technical Architecture, it basically deals with the structure and behavior of the technology infrastructure: the client and server nodes of the hardware configuration, the infrastructure applications that run on them, the infrastructure services they offer to applications, and the protocols and networks that connect applications and nodes.


Figure 1 Infrastructure Architecture Model

 

The above figure presents a possible Infrastructure Architecture Model where you have the building block of the Infrastructure Architecture (Business Functions, Architecture, Technology, Organization, Financial and Training) presented in three phases:

Current Environment (PMO) – where the blocks are randomly distributed.

Transition Environment (TMO) – where the blocks are starting to get in order.

Target Environment (FMO) – where the blocks are organized in the best possible working solution.

I have chosen the above model and the multi-colored blocks because it should be a constant reminder for all architects that the Target Environment you have reached today is the Current Environment of tomorrow, and the building blocks will gradually become randomly distributed again!  As the quote below says: “The future is here, it’s just not widely distributed yet!”

 


 

Figure 2 Quote from John Sing, Director of Technology presentation

 

Finally, your project is finished and your new Infrastructure Architecture is done: a perfect cube of multi-colored little cubes, perfectly interlocked.