Social Networks

“In social science, the structural approach that is based on the study of interaction among social actors is called social network analysis” (Freeman, 2004). Social network analysts study the structures formed by nodes (people or groups) connected by links (relationships or flows). Traditionally, researchers in this field have studied such structures in human societies, or in animal societies such as those of ants, bees, or apes, discovering various patterns and examining their effects on those societies (Krebs, 2000). Social network analysis contributes to economics, biology, chemistry, social psychology, information science, geography, and sociolinguistics. More recently, with the growth of the Internet, social network analysis has extended its reach to the study of the World Wide Web (WWW), email communication, computer networks, online marketing, and the spread of viruses. This article provides an overview of interesting findings from this field. Note that this research is distinct from work on the development of social networking websites such as Facebook and MySpace, although the analysis of such websites is a subfield of social network analysis.

Michele H. Jackson suggested network analysis as a methodology for examining the WWW. He described the WWW using network-analysis terms such as actors, relations, networks, and network structures, suggesting that social network analysis is applicable to the network of hyperlinks (Jackson, 1997). Han Woo Park summarizes research related to hyperlink network analysis (HNA) and presents HNA as an emerging methodology (Park, 2003). Park’s paper implies that the use of social network analysis for studying the WWW is still in its early stages and has much left to reveal.

Although the social analysis of the WWW is an emerging field, it has already unveiled various interesting phenomena, several of which are described by Barabási in his book “Linked: The New Science of Networks”. According to him, “Networks are present everywhere. All we need is an eye for them” (Barabási, 2003). He treats the WWW as a social network rather than a static web of sites. The network formed by websites is no different from other naturally formed networks, such as human networks or biological networks. He discusses several studies showing that most networks, small or large, natural or artificial, exhibit a similar topology and similar properties. For instance, 90% of the documents on the web have ten or fewer links, while a small number of pages are referenced by millions. He calls these few pages hubs, and they keep the network connected. Similar hubs are found in the networks formed by food chains, chemical reactions, and airports. The WWW also obeys the 80/20 rule: “80 percent of links on the Web points to only 15 percent of Webpages”, and likewise “80 percent of citation go to only 38 percent of scientist” (Barabási, 2003). This property of the web is described by a power law[1]. The existence of a power-law distribution implies that small events (pages with few links) coexist with large events (pages with many links), producing an ever-decreasing curve.
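This 80/20-style concentration follows directly from the power-law shape. As a rough illustration, the following sketch (mine, not from Barabási) samples page in-link counts from a power-law distribution and measures what share of all links the top 15% of pages receive; the exponent 2.1 is an assumption, close to the value commonly reported for the web’s in-degree distribution.

    import random

    def sample_power_law(n, gamma=2.1, k_min=1.0):
        # Inverse-transform sampling of a continuous power law P(k) ~ k^-gamma.
        return [k_min * (1 - random.random()) ** (-1 / (gamma - 1)) for _ in range(n)]

    links = sorted(sample_power_law(100_000), reverse=True)
    top = links[: int(0.15 * len(links))]          # the top 15% of pages
    print(f"share of links held by the top 15% of pages: {sum(top) / sum(links):.0%}")

Run repeatedly, the printed share lands roughly around the quoted 80 percent; the exact number fluctuates from run to run, as heavy-tailed samples do.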

According to Gonzalez-Bailon, “(Hyper) links play a twofold role on the web: they open the channels through which users access information, and they determine the centrality of sites and their visibility” (Gonzalez-Bailon, 2009). Gonzalez-Bailon characterizes hyperlinks as the virtual medium through which information flows: users navigate the WWW through hyperlinks from webpage to webpage, assimilating the information they need. But these hyperlink networks are inherently biased, helping certain websites become central and more visible than others. The power-law distribution gives rise to a phenomenon called “rich-get-richer”. Because a small number of websites receive a very high percentage of the hyperlinks, these websites become “gravity centers” and tend to attract even more links. Moreover, search engines consider the number of hyperlinks pointing to a webpage a key criterion in ranking it, and so they further reinforce this mechanism, since people tend to reference the top results and ignore the rest (Gonzalez-Bailon, 2009). Barabási articulates that “rich-get-richer” is governed by two laws: growth and preferential attachment. The network starts from a nucleus and grows by the addition of new nodes, and each new node prefers to link to the nodes that already have more links (Barabási, 2003).
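The mechanism is easy to simulate. The sketch below (illustrative, not from the cited papers) implements growth plus preferential attachment in the style of the Barabási-Albert model: each new node links to existing nodes with probability proportional to their degree, and the earliest, best-connected nodes end up as the hubs.

    import random
    from collections import Counter

    def preferential_attachment(n_nodes, links_per_node=2):
        # Flat list of edge endpoints; choosing uniformly from it picks a
        # node with probability proportional to its current degree.
        endpoints = [0, 1]                      # nucleus: one edge between nodes 0 and 1
        for new in range(2, n_nodes):
            chosen = set()
            while len(chosen) < links_per_node:
                chosen.add(random.choice(endpoints))
            for old in chosen:
                endpoints += [new, old]         # record both ends of the new link
        return Counter(endpoints)               # node -> degree

    degrees = preferential_attachment(10_000)
    print("hubs:", [node for node, _ in degrees.most_common(5)])
    # The hubs are almost always early, low-numbered nodes: the rich got richer.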



Barabási, A.-L. (2003). Linked: The New Science of Networks. Perseus Publishing.

Brunelli, M., & Fedrizzi, M. (2009). A Fuzzy Approach to Social Network Analysis. 2009 International Conference on Advances in Social Network Analysis and Mining, (pp. 225-230).

Freeman, L. C. (2004). The Development Of Social Network Analysis: A Study in the Sociology of Science. BookSurge, LLC.

Gonzalez-Bailon, S. (2009). Opening the black box of link formation: Social factors underlying the structure of the web. Social Networks, 31(4), 271-280.

Jackson, M. H. (1997). Assessing the Structure of Communication on the World Wide Web. Journal of Computer-Mediated Communication, 3(1).

Krebs, V. (2000). The Social Life of Routers: Applying Knowledge of Human Networks to the Design of Computer Networks. Internet Protocol Journal, 3(4), 14-25.

Park, H. W. (2003). Hyperlink network analysis: A new method for the study of social structure on the web. Connections, 25, 49-61.

Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8(3), 338-353.

Zadeh, L. A. (1999). From computing with numbers to computing with words: From manipulation of measurements to manipulation of perceptions. IEEE Transactions on Circuits and Systems Part I: Fundamental Theory and Applications, 45(1), 105-119.


[1] A power law is a mathematical relationship between two quantities in which the frequency of one quantity varies as a power of the other, i.e., f(x) ∝ x^(-k) for some constant exponent k.


Games and A.I.

We review the level of AI used in a few video games. It turns out that the majority of game AI is scripted or hard-coded logic, with few academic AI techniques in use. There is great opportunity in this area to design algorithms for the video game industry.

1. Age of Empires

Age of Empires is a real-time strategy game in which a player builds a kingdom and armies with the goal of surviving through the ages, defending the kingdom, and conquering other kingdoms. A player starts with a few workers and a primary building, and then gradually builds a complete town with many buildings.

The AI in Age of Empires is smart enough to find and revisit other kingdoms and their buildings (i.e., pathfinding) and to attack them. One major drawback is that the AI is too predictable: it tends to repeat the same path over and over, so a player can easily anticipate its route, giving human players an edge. The AI is also very poor at self-defense.
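That predictability is what deterministic search produces. The sketch below (illustrative; not Age of Empires’ actual code) shows a breadth-first pathfinder with a fixed neighbor-expansion order: queried repeatedly between the same two tiles, it returns the identical route every time, which is exactly the pattern a human player learns to exploit.

    from collections import deque

    def find_path(grid, start, goal):
        # grid: list of strings; '#' marks an impassable tile.
        rows, cols = len(grid), len(grid[0])
        parent = {start: None}
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            if (r, c) == goal:
                path, node = [], goal
                while node is not None:        # walk back to the start
                    path.append(node)
                    node = parent[node]
                return path[::-1]
            # Fixed expansion order => deterministic, hence predictable, paths.
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] != '#' and (nr, nc) not in parent):
                    parent[(nr, nc)] = (r, c)
                    queue.append((nr, nc))
        return None

    grid = ["....",
            ".##.",
            "...."]
    print(find_path(grid, (0, 0), (2, 3)))   # same route on every run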

2. Half-life and Counter-Strike

Half-Life is a science-fiction first-person shooter (FPS). It took shooter games to the next level of intelligent enemy behavior, winning more than 50 ‘Game of the Year’ awards. Half-Life is a more or less traditional game in which a player advances by killing the creatures in his way. Monsters in this game can hear, see, and track human players. They can also flee when they are being defeated and call on other monsters for help to trap the human player.

Counter-Strike is another FPS, running on the Half-Life game engine. Unlike Half-Life, it is a team-based game with two teams: terrorists and counter-terrorists. The terrorists aim to plant a bomb, while the counter-terrorists aim to stop them from planting it and to eliminate them.

Counter-Strike uses bots to simulate human players on the teams, giving the illusion of playing against actual players. The bots show intelligent behavior in pathfinding, attacking opponents, fleeing, and planting and defusing bombs. But they share the predictability problem: an experienced player can anticipate their behavior and wait at strategic locations to kill them. Part of the reason is that the majority of the bots’ AI is implemented using static finite-state machines (FSMs), as sketched below.
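A static FSM hardwires which stimuli trigger which behaviors. The toy bot below is illustrative (its states and transitions are invented, not taken from Counter-Strike): because the transition table is fixed, the same inputs always produce the same behavior, which is precisely what makes such bots predictable.

    class Bot:
        def __init__(self):
            self.state = "patrol"

        def update(self, sees_enemy, low_health):
            # Fixed transition table: identical inputs -> identical behavior.
            if self.state == "patrol":
                if sees_enemy:
                    self.state = "attack"
            elif self.state == "attack":
                if low_health:
                    self.state = "flee"
                elif not sees_enemy:
                    self.state = "patrol"
            elif self.state == "flee":
                if not sees_enemy:
                    self.state = "patrol"
            return self.state

    bot = Bot()
    for sees_enemy, low_health in [(False, False), (True, False), (True, True), (False, False)]:
        print(bot.update(sees_enemy, low_health))   # patrol, attack, flee, patrol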

3. Creatures

Creatures is an artificial-life program. The software comes with six eggs, each holding a unique creature. The game starts when an egg hatches and a Norn comes out. Norns learn by interacting with their environment, and the player watches them grow through those interactions. A Norn passes through adolescence and adulthood and eventually lays eggs of its own, letting the player build colonies of such creatures.

Creatures was the first of its kind, and it marks the first real use of academic AI techniques in video games: it uses machine learning, specifically neural networks, as the learning algorithm for the Norns.
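To make “learning by interaction” concrete, here is a deliberately tiny sketch of reward-driven learning with a single neuron. The real Creatures engine uses a far more elaborate neural architecture, so everything below (the senses, the reward rule, the learning rate) is invented purely for illustration.

    import random

    # One neuron mapping two senses (hunger, food_nearby) to an eat/don't-eat choice.
    weights = [random.uniform(-1, 1), random.uniform(-1, 1)]

    def decide(senses):
        return sum(w * x for w, x in zip(weights, senses)) > 0

    def reinforce(senses, reward, rate=0.1):
        # Reward strengthens, punishment weakens, the active connections.
        for i, x in enumerate(senses):
            weights[i] += rate * reward * x

    for _ in range(1000):
        senses = [random.random(), random.random()]      # hunger, food_nearby
        if decide(senses):                               # the creature chose to eat
            reward = 1 if min(senses) > 0.5 else -1      # good only if hungry AND near food
            reinforce(senses, reward)

    print("learned weights:", [round(w, 2) for w in weights])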