
Thursday, March 24, 2022

The Website is Under Construction

I was recently asked how I think about a website. It was a simple question, and after years of optimizing professional and personal websites, I described the website as a discovery and transaction platform, optimized to reduce friction at each step of finding or transacting, before or after the most desired action (say, a conversion of any type, for simplicity). I was dissatisfied with my answer and kept thinking about it. I had only looked at websites from the perspective of discovery pages (each page and action serving a different stage of discovery, from awareness and intent to conversion and nurture), or in terms of top, middle and bottom, with the same types of pages bucketed into a marketing-funnel framework. The purchase funnel is a good framework for quantifying ecommerce in terms of funnel actions by page, and it works very well, especially in advertising, audience targeting and optimizing pages toward intentional goals.

With all the website events now available, and all the elements on the site tunable at lightning speed on the client side with single-page applications or via server-side rendering, I would think of a website as a machine that takes inputs and generates outputs, like a page. The inputs of the machine are the context and the actions, within a time scope that bundles those actions. The context can be defined by different signals, which can be user specific, device specific, the channel used, or any other parameters that define the intent or goal, packaged within the URL and privacy-protected client-side data. The actions can be as micro as a single position of a mouse scroll. The machine can then decide, on the fly, the best way for it to run and create output. Imagine that, depending on the page you come to, the machine is trained to a specific goal for that page, and that it constantly learns from the context and actions. To build further on the thought, imagine that all the creative elements, from copy to images, are created through machine learning and trained to the ultimate goal the page is trained for, whether that goal is the best consumption of the page measured by time on page, or a desired action; even the goal itself can eventually be discovered by the same process of testing and learning, automatically. This determination of the goal itself is the ultimate aim, because at the end of the day, the purpose of the machine's outputs is to provide users with the inputs that make achieving their goals simpler, or that clarify their purpose or need, making it easier for them to reach fulfilment. Depending on the scale and scope of incoming signals, the machine can run multiple outputs at scale and learn, optimizing to a target.
The similarity traits of the user need only be specified through privacy-protected parameters in the URL and combined with first-party data; that gives the machine the material to self-optimize its output toward the best probability of a desired next action, whether off page or on page, in the future or immediately. You might think that I am talking about personalization at hyper scale, experience management platforms or content optimization, and you are right. Maybe I am dabbling in semantics here, but to me the concept of a fully configurable machine sounds like something that has all of these engines built within the page itself, with the output of the machine optimized for the best possible outcomes and constantly learning; more like native functionality than a borrowed engine. The success of such a machine depends heavily on the quantity, diversity and quality of its inputs. Maybe this is a conceptual difference and not a paradigm shift.
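The machine described above, context and actions in, page variant out, constantly learning toward a goal, can be sketched as a simple bandit-style learner. This is only an illustration of the idea, not any real system; the context keys, variant names and reward signal are all invented for the example.

```python
import random
from collections import defaultdict

class PageMachine:
    """Toy sketch of a page 'machine': for each context it learns which
    output variant best serves the page's goal, via epsilon-greedy choice."""

    def __init__(self, variants, epsilon=0.1):
        self.variants = variants
        self.epsilon = epsilon
        # per-context running statistics: variant -> [total reward, serve count]
        self.stats = defaultdict(lambda: {v: [0.0, 0] for v in variants})

    def serve(self, context):
        """Pick a variant for this context: mostly exploit, sometimes explore."""
        arms = self.stats[context]
        if random.random() < self.epsilon:
            return random.choice(self.variants)
        # exploit: highest observed mean reward (unserved variants default to 0)
        return max(self.variants,
                   key=lambda v: arms[v][0] / arms[v][1] if arms[v][1] else 0.0)

    def learn(self, context, variant, reward):
        """Feed back the outcome (e.g. conversion = 1, bounce = 0) of a serve."""
        arms = self.stats[context]
        arms[variant][0] += reward
        arms[variant][1] += 1

# the context here is just (device, channel); real signals would be richer
machine = PageMachine(["long_copy", "short_copy", "video_hero"])
machine.learn(("mobile", "paid_search"), "short_copy", 1.0)
choice = machine.serve(("mobile", "paid_search"))
```

The point of the sketch is the loop, serve, observe, learn, serve again; the "goal discovery" idea from the paragraph above would sit a level higher, choosing which reward signal this loop optimizes.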

Regardless of the machine's internal workings, the future is here: with the right framework defined with the end result in mind, and, in time, with enough inputs, the machine will learn the best way to serve its output, the page, and nudge the visitor in the right direction.

Wednesday, December 29, 2021

What the About-Us pages of the Fortune 100 tell us about brand positioning

The about-us pages can tell a lot about a company: its current position in the market, its aspirations, its values and its leadership team. I wanted to see whether some key themes emerge when we look at the about-us pages of the largest companies in the world. The Fortune 100 seemed an obvious choice. Due to their impact they have a lot more to say, so their about-us pages are not just that; they are really about-the-world pages. Although I explore a lot of websites, reading 100 about-us pages would take a long time, so I resorted to text analytics. To simplify the analysis I mostly used either the "who we are" or "what we do" pages, and although most of the Fortune 100 are multinationals, I explored the English-language pages. In some cases, if a values page was readily available, I took the content from that page as well, but that was the exception rather than the norm. The aim was to get a general idea of the topics top of mind for these organizations and see if there are collective themes. Some of the companies did a great job on their first page, but in many cases the about-us content was split into many sections. The argument can be made that it is tough to capture the essence on a single page, but some do it very well; for example, GE's about-us page does a nice job of putting re-invention front and center. For my purpose I found the "teaser" about-us page sufficient for text analysis. To reiterate, I was mostly interested in "who we are" and the topics that surround that theme, more than anything else. I found that companies in a similar domain (industry) had similar themes in their content, and it is interesting to see how brands are positioning themselves in their space.

Let's get the methodology out of the way. Three techniques were used: word clouds, frequency distribution of words, and Latent Dirichlet Allocation (LDA). This post is not about LDA or text analytics; most of the implementation was borrowed from the internet and tweaked, and no originality is claimed for the text analysis. My interest lies in the essence of corporate branding and the emergent themes top of mind at top organizations, as manifested in one of the core messaging pieces on the digital front. The analysis also lends itself to other avenues: page optimization, analysis of content at scale for competitive purposes, or simply forming a robust opinion.

The word cloud, as nostalgic as it is from the early days of text analytics, is still a very useful tool. It paints a picture of words that sets the stage for a deeper look. Even at this bird's-eye view, if one of these large companies is your client, you get a starting point on what general themes are coming through. At a glance: in healthcare, which is mainly insurance carriers, the messaging is centered around quality of service and improving the lives of people and patients; the pharma companies, like J&J and Pfizer, are more research oriented, focusing on medicines to cure diseases around the globe; the oil & gas and utilities' big picture revolves around long-term sustainability beyond petroleum and protecting the world's resources; and the automobile industry is on electric and zero-emission cars while keeping safety front and center. Overall, the messaging of Fortune 100 companies, as gleaned from about-us, "who we are" and "what we do" type pages in a word cloud, revolves around people, the world, and communities. It is a lot more about people and communities than just about the product or service that the company offers.

Another simple view complementing word clouds is a frequency distribution. It gives a more quantitative view of the word clouds.
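The frequency-distribution step is straightforward to sketch with the standard library alone. The snippets and stopword list below are made up for the illustration; the real analysis ran over the actual about-us pages.

```python
import re
from collections import Counter

# invented about-us style snippets standing in for the real corpus
snippets = [
    "We improve the lives of people and communities around the world.",
    "Our people serve customers and communities with quality and care.",
    "Sustainability for the world and its communities guides what we do.",
]

# a tiny illustrative stopword list; a real run would use a fuller one
STOPWORDS = {"we", "the", "of", "and", "our", "with", "for",
             "its", "what", "do", "around"}

# tokenize, lowercase, drop stopwords, then count what remains
tokens = [
    word
    for text in snippets
    for word in re.findall(r"[a-z]+", text.lower())
    if word not in STOPWORDS
]
freq = Counter(tokens)
print(freq.most_common(3))  # "communities" appears in every snippet
```

The `most_common` ranking is exactly the quantitative view the frequency chart gives: the same words the cloud makes big, now with counts attached.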

The general themes of customers, people, world and communities remain on top. The global health crisis that consumed the better half of 2020 and 2021 showed up for every organization, as these companies, or some part of them, were engaged in solving the pandemic crisis.

For topic modelling I used LDA; it allows an interactive display to play with different topics, which is quite fun. For those that like to know the devil in the details, aka the model intricacies, a quick synopsis follows.

The only word types used were NN, JJ, NNS, VBG, VBP, CD, RB, VBD, VBZ and VBN. The coherence curve revealed a few optimal values, and after a little experimentation I settled on 18 topics, which gave above 0.33 coherence with a perplexity score of -10.5 for the combined about-us pages of all 100 companies.

Topical analysis is a very useful tool for training intuition, synthesizing information at scale into key themes. The topic analysis revealed some recurring themes. As mentioned earlier, the coherence scores improved greatly when the analysis was narrowed to a particular vertical; the improvement in topic coherence is directly related to recurring themes among companies in the same vertical. Taking the case of the utilities vertical with the energy companies, three unique topics came up, and the distance between the topic circles shows that the topics are relatively distinct. The first topic shapes up to be around an energy future with renewables and the safety of employees, the second around conservation of natural resources, and the third around the scale of the company and the different ways to produce electricity. These topic definitions are pretty subjective, but as the scope of the context narrows, the topics become more definable.



Generally speaking, the Fortune 100 messaging is around people, their impact on communities, and their commitment to the planet and their customers. Health and safety were present in nearly all verticals. The key stakeholders, customers and employees across the globe, were part of the narrative.


Sunday, October 31, 2021

Electronic Marketplaces: participation considerations

When it comes to playing, or not, in electronic marketplaces, a good majority of economic agents are still debating the issue. The question of participating in or creating an electronic marketplace is pondered by producers and intermediaries alike. Most manufacturers, retailers, wholesalers, distributors etc. already have some semblance of an online store that acts as a multisided electronic marketplace with a myriad of participants on the demand and supply sides. The question is whether to become a supply-side partner in an established marketplace, if it serves the intended consumer of the articles offered, or to own an electronic marketplace connecting the demand and supply sides, in the name of finding new avenues of growth or improving transactional efficiencies. The overarching reasons for such a debate stem from eyeing the platforms' increasing share of the demand side and the potential network benefits of building demand- or supply-side aggregations using digital competencies. A detailed understanding of electronic marketplaces is needed to make such decisions.

Electronic Marketplaces (EMPs)?

The classification of electronic marketplaces is a complex endeavor, even if it is just sifting through the available literature and collating attributes. In the most basic terms, an electronic marketplace can be defined as an online platform that connects participants from the demand and supply sides and facilitates transactions digitally. On a more technical note, electronic marketplaces can be viewed along different dimensions, such as the number of participants, ownership structures, market mechanisms, relations, articles offered, etc. For the purposes of this writing, classification based on the coordination of economic activities through aggregation, collaboration, auction and exchange is a curious path to explore. The other two dimensions relevant to the topic here are the number of participants, from many-to-many and one-to-many to few-to-few, and lastly the market or hierarchical orientation. Since this is not a conversation on economics, it is practical to assume that an EMP is a business-oriented digital marketplace which combines the concepts of governance and process optimization. Amazon Business, for example, can be considered a hybrid agent, focused not only on governance of transactions by reducing search and pricing costs, but also on creating hierarchical optimization of related processes, from relationship management to streamlining procurement workflows. I would think that most multisided EMPs have similar designs or are moving toward them.

Commonly, EMPs are termed e-hubs, exchanges, portals, auction houses, pricing engines, extranets, comparison engines, meeting places, collaborative platforms, e-stores, etc. The proprietorship can range from neutral owners to competitive owners. The definitions, to a degree, follow the narrow or broad needs of the transacting parties and their industries, with varying numbers of participants on either side of the platform. As a governance mechanism, EMPs reduce coordination and search costs while connecting sellers to buyers more efficiently than searching and communicating through other media. An argument can be made that the efficiency and ease of transacting through an EMP can be achieved through other media, if a relational context with trust in a few key suppliers is paired with expertise in specific assets and their application. In that case the broader online marketplace becomes the means of lowering discovery and search costs, where the buyer, in a few clicks, can find the competitive metrics and still choose the trusted partner. As the risk associated with a transaction increases, EMPs pose an adverse-selection argument. To that end, technology has come a long way in a short time, with connected systems and specialized platforms covering every business process. Enhanced content forms, showcased through sophisticated presentation layers, are bringing the senses closer to acceptable validation. Couple this with the inherent transparency of the internet, and adverse-selection costs will continue to decline, albeit at a different pace for different assets.

Why the question?

If EMPs improve transaction efficiencies and provide a critical mass on the demand side, why the hesitation by any supply-side participant to join a thriving demand-side EMP? The contemplations are not without merit. The most prevalent concern is the cost of disruption to the status quo. The core tenets of that fear are: destabilizing an existing ecosystem; the value-based hierarchy, which is exhibited in discounts on either side of the platform; the competitive pressures resulting from (competitive) EMP owners' participation on top of the other partakers; and the loss of customer intimacy, which, although high in transaction costs, is still a valued asset for intermediaries and producers alike. The threat of replacing a high-cost, high-value relationship with a digital intermediary, impersonal and transactional in nature, can undoubtedly have long-term consequences and business-model implications. The discussion is exacerbated by the evident aim of EMPs to provide more substitutes and product liquidity (which is understandable, since demand-side participation is partly fueled by competition-driven choice and price). Whether one enters perfect or pseudo-perfect competition, the end result is the same: marginal cost is the eventual expectation of return for participants in the long run, and once you enter the marketplace, the pricing strategies from backward induction point to the same result for survivors; profits slightly above marginal cost, or at marginal cost. Unless a single participant offers endless products on an EMP, it will eventually exit, as the price game at each single-product level becomes a war of attrition and the profitability thresholds are breached one by one.
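The marginal-cost endgame described above can be made concrete with a toy undercutting game: two identical sellers alternately undercut each other by one price tick until no profitable undercut remains. The numbers are invented, and real marketplace dynamics are far messier, but the resting point is the classic Bertrand result.

```python
def undercutting_game(start_price, marginal_cost, tick=0.01):
    """Each round the currently losing seller undercuts the leader by one
    tick, as long as the new price still covers marginal cost. Returns the
    resting price once no profitable undercut is left."""
    price = start_price
    while price - tick >= marginal_cost:
        price = round(price - tick, 2)
    return price

# made-up numbers: list at $10.00, produce at $6.00 marginal cost
resting = undercutting_game(start_price=10.00, marginal_cost=6.00)
print(resting)  # the price war bottoms out at marginal cost
```

Whatever the starting price, the survivors end up at (or a tick above) marginal cost, which is the attrition-war intuition of the paragraph above.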

The proprietors of ecommerce marketplaces such as Amazon, Walmart etc. strongly mediate their ecosystems and mandate certain conformity, a necessity in their eyes for building and maintaining a critical-mass demand side. These established EMPs essentially absorb the transaction-governance costs that would have been transferred to participants transacting in other media, yet they come with low to zero onboarding costs for supply-side partners while offering a well-established demand side. It takes considerable resources, even with skilled partners, to launch an EMP and create a desirable demand side to attract supply-side partners. The resources needed to handle the complexity of creating an EMP and a critical-mass demand side must be balanced against any realized and perceived efficiencies.

The importance of access to information, and the ability to unveil opportunities through information, cannot be ignored. Current technological advancements undoubtedly provide more ways than ever to exploit the slightest information edge. Although the EMP proprietor may give all participants access to some competitive information, not all of it is given, or can be given, creating an information asymmetry that is not the hallmark of a physical marketplace. In a physical marketplace, players have similar uncertainties and information asymmetries, and each player, according to his understanding of the market and its value chain, may have a leg up through specific information. Competitive EMPs differ from physical marketplaces in this regard: in EMPs the requirements of property ownership and the cost of participation are lower, and the information imbalance favors not only the consumer but also the marketplace owner, especially if the owner is also a participant. The free flow and lower acquisition costs of information shift market power to the marketplace proprietor and the consumers.

Producers creating a hierarchical marketplace for suppliers have the potential benefit of consolidation and supply-chain resilience, but that can come at the cost of buying power if the marketplace is many-to-one. On the flip side, entry by producers into EMPs as suppliers, even with a limited assortment, may be perceived by existing intermediaries as a less risky foray by producers toward eventually replacing traditional intermediaries.

Producers considering third-party EMPs also need to think about substitutes. This is not a new market phenomenon, but EMPs drastically reduce search costs, and with the improvements in machine intelligence, EMPs can offer fast and accurate substitutes, all of which furthers opportunistic behavior. Even if the brand name is a synonym for the article and its application, EMPs can speed the erosion of the brand's connection to an application due to the availability of easy substitutes, thereby increasing the producer's risk.

Conclusion

The core value proposition of the multisided marketplace owner is facilitating the matchmaking between demand-side and supply-side participants and enabling transactions at lower costs. Whether one is connecting a certain demand side to a specific supply side, or making the stride from reselling intermediary to multisided platform ownership, the key question is understanding the value creation. Success factors include critical mass and collaboration support, measuring critical mass by the number of participants or the transaction volume per participant, while balancing channel disruption, brand dilution, and competing as a substitute or with the substitutes. Whether you draw influence diagrams, resort to payoff matrices or look at indifference curves for the target audiences, there is plenty of fun in the decision-making process. It is a matter of understanding the participants, the true nature of the demand and supply sides, and what exchange is being optimized, that will fuel the network effect.

Sunday, July 4, 2021

Customer focus is a data imperative

The age of information is really the age of confirmation, and it is upon us. Gone are the days of naive customer focus, defined as providing the best service to the customers. Customer focus, now, is the ability to understand the customer and serve each customer based on their particular needs and expectations, while increasing their value to the organization. The key here is to understand before serving, and the level of understanding determines the depth of the relationship with the customer. The best form of understanding is quantifiable knowledge, attained through cycles of data analysis converted into domain expertise; it is empirical in nature and forms the basis of a priori expansion. (We can have a separate debate on the chicken-and-egg dilemma of whether an idea comes first, or whether experiment-to-knowledge fosters ideas.)

Role of Technology & Data 

Why and how do technology and data help shift focus to customers? I will quote from one of my favorite readings on the topic:

 "in an ideal customer centric organization everyone is focussed on increasing the customer value and understands that this requires learning from each customer interaction and ability to use what has been learned to serve customers better."
 (Michael J. A. Berry and Gordon S. Linoff)

The authors go on to profess that such an organization records each and every customer interaction; in short, the firm keeps an extensive historical record of every customer interaction. A practical caveat to the above is that the extent to which the organization leverages such information to manage relationships depends largely on the multidisciplinary capabilities of the firm. What we do know from evidence is that the most valuable companies in the world do this better than the rest.

Before we get further into this debate, we need to explore a few related concepts, and then we can direct a company's ambition to be customer centric. The key concepts are what I call the 3 Cs of customer focus for an organization:

1. Customer
2. Customer value
3. Customer Relations

Customer

To measure anything we must define who the measure applies to, what is being measured, and to what extent the concept is explained by the measurement. Is the entity an abstraction or a tangible body? E.g., the customer is an abstract concept and the person or company is a tangible entity; similarly, a household becomes the abstraction and husband, wife, child 1, child 2 etc. become the entities. The abstraction may encompass a number of organisms, or further composites, that can be quantified by some aggregation of the individual measures and some grouping of the individual properties of its entities.

The abstraction of the customer can take many forms depending upon the definable singletons, especially in B2B scenarios, where the entity can be an organization, a building, a department, a person, a function, a title, or a combination of these, which in turn can be cumulations that can be further dissected into users of products and services or appropriators (pre- and post-sale included) of the same. This raises questions: is the customer the service- or product-acquiring entity, or the payment-facilitating agency? Is it the end-use facility or the end user? Is it the entity paying the bills, or the user making the transaction on behalf of the end user or organization? Any of these can stem from a single or multilayer abstraction.

The answer to the customer-definition dilemma is not single or hierarchical groupings, but a correlated set of abstractions of various entities that have measurable dimensions of interrelated attributes.
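The abstraction-versus-entity distinction above maps naturally onto a small data model. This is an illustrative sketch with invented names and fields, using stdlib dataclasses: tangible entities (people, departments) roll up into abstractions (household, buying account) whose measures aggregate the entity measures.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A tangible body: a person, a department, an organization."""
    name: str
    annual_spend: float  # one measurable dimension of the entity

@dataclass
class CustomerAbstraction:
    """An abstract customer (e.g. 'household') composed of entities."""
    label: str
    entities: list = field(default_factory=list)

    def total_spend(self) -> float:
        # one way to quantify the abstraction: aggregate entity measures
        return sum(e.annual_spend for e in self.entities)

household = CustomerAbstraction("household", [
    Entity("adult_1", 1200.0),
    Entity("adult_2", 800.0),
    Entity("child_1", 150.0),
])
print(household.total_spend())  # 2150.0
```

The "correlated set of abstractions" would be several such objects over overlapping entities (household, account, end user), each with its own aggregations, rather than one fixed hierarchy.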

Customer Value

Ultimately, the customer's value to the company is the transactions (historical and future, discounted or not). Do we need a rigorous analysis of customer value, and why can't we just sell to everyone that wants our goods and services? If an organization can sell to and serve everyone that desires its product, then it absolutely should. In reality there is a cost for the enterprise, to be discovered, considered, chosen and carried forward, attached to every phase of selling and serving. This problem is mitigated when the firm either has a large customer base, where the volume of transactions subsidizes the cost of intimacy with the masses, or leverages technology for customer relationship management, albeit at a cost to intimacy. Other ways include marketing tactics to increase mindshare at various phases until the firm becomes top of mind.

It is critical to understand that customer value on a time horizon quintessentially measures the effectiveness of the acquisition, not in terms of how valuable the customer is, but how valuable the essence of the interaction with the company is to the customer. To an extent one can predict the needs of customers, bucket them into value segments, and then try to acquire those classified by similarity factors, in the hope that, just by definition, the inherent properties of the firm will resonate with intrinsic customer nature and the firm will increase customer value. In reality, acquisition simply enlarges the sample set for testing the enterprise value itself.

At the end of the day, a business exists to fill a need by providing products or services. Knowing how the need arises, or creating the need in the first place, understanding how the need is being filled, and knowing what the consuming organization holds as valuable in the process of identifying and fulfilling its needs: this is the landscape of customer-value creation.

One can say that lifetime-value measurement is brand-equity governance.
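Since customer value was framed above as transactions, historical and future, discounted or not, the arithmetic is worth writing down. A hedged sketch of one common lifetime-value form follows; the margin, retention rate, discount rate and horizon are invented numbers, and real models vary widely.

```python
def lifetime_value(annual_margin, retention_rate, discount_rate, years):
    """Sum of expected margins over a finite horizon: each year the margin
    is weighted by the probability the customer is still around (retention)
    and discounted back to today."""
    return sum(
        annual_margin * (retention_rate ** t) / ((1 + discount_rate) ** t)
        for t in range(1, years + 1)
    )

# made-up inputs: $100/year margin, 80% retention, 10% discount, 5 years
ltv = lifetime_value(annual_margin=100.0, retention_rate=0.8,
                     discount_rate=0.10, years=5)
print(round(ltv, 2))
```

Note how retention and discounting together shrink the later years; most of the value sits near the front of the horizon, which is why early interactions weigh so heavily.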

Customer Relations

How can an organization have better customer relations? By understanding its interactions with its customers! The point to note is that an interaction is not just a record of every transaction (recording a transaction is a given for any organization with real legal bearings), but the record of every implicit or explicit touchpoint. It is not only communication in the traditional sense; it is any and all touches, implicit or explicit, the customer has with the organization, from the minutest glance at a piece of marketing collateral to the experience of an hour-long call with a company associate. Implicit interactions may be defined as the ones not based on direct two-way communication but that the customer witnesses in company operations: its marketing, its service pre- and post-sale, all its visual components, messaging components, informational pieces, fulfillment timings, packaging details; the list is huge, and the importance of implicit interactions cannot be overstated. That is where technology has shifted the pendulum in favor of companies that can understand implicit interactions at scale.

A few years ago, the quantification of a customer relationship was mostly transactional (be it a POS-type transaction or web-based activity; a click is a transaction of sorts). It was difficult, if not practically impossible, to gauge the essence of all interactions: spoken, unspoken, written and unwritten, observed and unobserved. With the improvements in technology to capture, gather, store and analyze the transactional history at scale, the footprints that can be detected, and the users that can be identified, albeit at a cost to privacy, the true nature of the customer relationship can be defined.

Although data is termed "the new oil", the gathering of data itself has to be tamed to customer expectations of privacy as well as the maintenance of service levels. If the customer is bogged down by providing data to the organization, it will more than likely stress the service and the relationship at the contact point. That is where data mining is most effective, as it can connect the dots between end points where customers voluntarily give out information as part of the expected interaction or transactional necessity. The organization can accurately know not only its (soft) relationship with its customer, but also the value of the human communication previously left to the domain of human understanding and appraisal. Advancements in speech recognition, and in gathering and connecting expressed opinions using NLP, give an organization further insights about its customers that were previously not possible.

Customer focus is understanding the composition of your customer, measuring the effect of all the implicit and explicit interactions and their value to the customer, and finally building the relationships by affecting every interaction, and, let's not forget, continuously evaluating the 3 Cs of customer focus.


Simple yet not!

This seems simple enough, but in reality category managers, and everyone from call-center associates to top managers, are usually focused on, and rewarded for, the amount of product sold and the margin made, while trying to exceed the firm's service touchstones. The culture shift to measuring and increasing customer value on a rolling basis is different, if not difficult. To blame the product focus on culture, or on a product-profitability mindset, is naive, and who can really argue with growing company profitability? The challenge is to translate those product-margin metrics into customer-value metrics and tune the sum product of organizational effort to measuring and growing customer profitability. That may very well be product profitability within the customer definition and customer value, but it has a different long-term impact on how the organization is held in the eyes of the customer, and on the enterprise's ability to manage those expectations.

The organizational imperative is to create every type of customer definition; find the value of each; record all interactions; map them back to value tiers; connect operational excellence to value tiers and to products or services; connect value segments to all interaction types; and then find the best models that predict any and all of those clusters. Finally, choose which classifications and associated measures best predict the company-level KPIs. At the end of all this activity, a virtuous cycle will emerge, creating an ecosystem that generates insights from people and data and uses those insights to tune implicit and explicit customer interactions, from the training of staff to the tuning of resources and operations.
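One small step in that cycle, mapping customers to value tiers so that interactions can later be analysed per tier, can be sketched in a few lines. The quantile-style cut and the customer values below are made up for the illustration; real tiering would use modeled lifetime value, not a single number.

```python
def value_tiers(values, n_tiers=3):
    """Rank customers by value and split them into roughly equal tiers,
    tier 0 being the highest-value group."""
    ranked = sorted(values.items(), key=lambda kv: kv[1], reverse=True)
    size = max(1, len(ranked) // n_tiers)
    return {cust: min(i // size, n_tiers - 1)
            for i, (cust, _) in enumerate(ranked)}

# invented customer values
tiers = value_tiers({"c1": 5200, "c2": 150, "c3": 980,
                     "c4": 4100, "c5": 610, "c6": 75})
print(tiers)  # c1 and c4 land in the top tier
```

Once every customer carries a tier label, the rest of the imperative, connecting operations, interaction types and models back to tiers, becomes a matter of joining on that label.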

Ultimately, creating a virtuous cycle is a joint venture between technology, data, analysis and people.

Saturday, February 13, 2021

Predicting Google's next acquisition

What more does the company that has almost everyone in the world touching its properties once a week, in some way, want? What is the real race?

We know by now the race.

People are getting intelligent (or let's say digitally savvy), information is getting larger, discovery is getting harder, computing power is increasing, and the algorithms can finally be processed.

In the quest to get ever so close to predicting the action, you have to connect intention and context. Intention is nonlinear but predictable. The build-up to intention leaves its own trail of breadcrumbs, thanks to the dopamine our brains receive with every touch; a context is spun, an intermeshing of connected strands. But those breadcrumbs and those signals still can't match what's really happening inside us.

Close to your heart and mind, in literal terms, what is happening in you, within you: that is the last frontier of connecting to a human. The genetics are decoded. The carbon form can disagree with its disposition, but the composition of its code, along with the indicators of its body and mind and the temporal shaping forces of environment and circumstance, will be more accurate than the subject's (you and me) assertion of its own path to a decision. Emotion and motion are connected; physical reality is spiritually inspired. That leads first to acquiring the physical vitals of the carbon form, and although they are good for many a conclusion (think Fitbit), they are still not enough; close enough is not enough.

If you have a computing device that can presumably process and break down long bits of intertwined complexity, you can certainly feed it the vibrations of a human brain to decompose. And that, my friends, will be the next play for the big players. Of the remaining senses to be captured, thoughts are the only direct playing field up for grabs.

I have not dug into which of these companies already have investments from the big four, but I believe this is where you will find the next acquisition, or maybe something closely akin to it. Elon's got something in mind; he is thinking of your mind...

Wednesday, February 5, 2020

The missing 'C' in the 3Cs of customer focus

My last blog was about customer focus, and while thinking about customer centricity I talked about customer focus in terms of 3Cs from the organization's perspective, i.e., defining the customer, calculating the customer's value, and then building that value over time through customer relations. But these strategic customer imperatives from an ESO's (environment-serving organization's), or let's just say a profit-seeking organization's, perspective miss a critical detail: the customer's point of view. While the company can think all it wants in terms of customer focus, customers rarely mirror the company's point of view. Customers, that is, we, you, I, us, they, all have their own agendas, driven by a myriad of factors, as we go on our merry journey to acquire whatever we wish to acquire. There were times when the journey from discovery to transaction was tough for the customer, since information asymmetry favored vendors. Information transparency has tilted the balance in favor of the consumer, and no matter how customer-focused the organization is, the ultimate choice in the marketplace lies with the customer. You can sway and bend the choice but you can't own it, unless you are a monopoly (a case we will leave for regulators to solve and ignore in our discussion). Our aim serves a more down-to-earth purpose: to define the 4th C in customer focus as the "Customer's" focus.


We will need to find our way in a heap of information to prove a very simple customer choice model, almost too simple to be true; yet boldly we will venture into the complex, seeking the simple. The study of choice will take us into mathematics, statistics, econometrics, political science, computational social sciences, psychology, and its cousin behavioral economics, to name a few. With the aspiration to develop a simpler customer choice model, we will need to boil choice theory, decision theory, expected value theory, prospect theory, and maybe some other theories, into something that is relatable and practical for everyone in the organization to understand. The ocean of knowledge that surrounds human choice and the factors that affect it leaves the sanity of such an ambition for this blog a bit questionable, and this is where I have to put forth the slight caveat, for the reader's benefit (as most of my blogs do), that this blog, after all, is my diary and a means to synthesize my thoughts, and any synthesis of information hereof should be taken only as an encouragement of further inquiry into the matter.

So let's jump in and start our line of investigation by assuming a simple hypothetical customer choice model of our own design, and say the customer choice depends on three basic tenets:

1. The concept of price
2. The concept of ease
3. The concept of security


Keeping the above pillars in mind, I will divide my study of sorts into two diggings:

1. Quantifiable Utility
2. Unquantifiable Utility

Or we can also examine it in terms of:

1. Subjective utility
2. Objective utility
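To make the three tenets concrete, here is a toy sketch of my own design (not an established model from the literature): each concept becomes a normalized utility, and a hypothetical customer combines them with subjective weights before choosing. The offers, utilities, and weights below are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    price_utility: float     # 0..1, higher means a better perceived price
    ease_utility: float      # 0..1, higher means less friction
    security_utility: float  # 0..1, higher means more trust

def choice_score(offer: Offer, weights=(0.4, 0.35, 0.25)) -> float:
    """Weighted sum of the three tenets; the weights stand in for the
    customer's subjective importance of price, ease, and security."""
    w_price, w_ease, w_security = weights
    return (w_price * offer.price_utility
            + w_ease * offer.ease_utility
            + w_security * offer.security_utility)

# The customer picks the offer with the highest score.
offers = {"A": Offer(0.9, 0.5, 0.7), "B": Offer(0.6, 0.9, 0.8)}
best = max(offers, key=lambda name: choice_score(offers[name]))
```

The interesting part is not the arithmetic but the weights: the subjective versus objective split above is really a question of where those weights come from and whether they can be measured at all.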

I know all this theoretical mayhem sounds archaic and professorial, but we will touch upon some very current mechanisms that affect customer choice, from personalization, digitization, and branding to the role of big data, AI, and machine learning in bringing the classical theories to new life.

We are not trying to justify any of the above constructs; we just want to take our thirst to the existing reservoir of understanding, borrow a little perspective from each cup we draw to stitch our story, and see if the resulting mosaic leaves an impression of sensibility...

Sunday, July 29, 2018

Theory X, Y & Z... Somewhere between Taylor, McGregor and..... Darwin

What is more interesting: a DAG, slicing tensors, a discussion on human motivation, or just taking the human out of the equation and using RPA? "Widgets as workers" are definitely something to look for, but I won't hold my breath for software cyborgs to raid the enterprise just yet; repetitive tasks are bound for extinction, that I won't argue. As I look to deep learning the brain of the organization, with fuzzy algorithms as an offshoot to indulge in, the fact remains that organizations are more human than machine; they are more nuanced than an exact science.

You can't motivate an algorithm; you can tune it, you can make it better, but its cold calculations can only handle the tasks it is meant to optimize. Algorithms have few fixed needs and hence are boring; they are only interesting while in development. Humans, on the other hand, are infinitely more fascinating, and just for that we will continue our discussion.

So the question is how we lead ourselves, and other warm machines with their pattern-recognizing cognitive processing, to peak performance. Are our organizations Theory X based or Theory Y based? Or maybe they decided to have the last "letter" on it and adopted Theory Z, or are a mix of X, Y, and Z along with some concepts borrowed from F. W. Taylor's principles of scientific management. There is much literature available in the field of organizational theory, but for this blog I will focus on a few excerpts from the mentioned works.

Some of the assumptions in the principles of scientific management, such as the tendency of the average worker to take it easy and that most people in an organization are average, are a bit hard to swallow and seem a harsh assessment of human nature. I do agree that the "employers knowledge of a given class of work gets tainted by their own experience which gets hazy by age and casual and un-systematic observation of their men" (aka MBWA; and although Gemba may work on shop floors, it fits the above statement by Taylor), but what follows afterwards seems a very autocratic and Victorian explanation of employees. The evolution of the organization is missed, since the measurement of the knowledge worker is a more excruciating task than the measurement of a factory widget maker. Even if the modern worker is a software widget maker, the complexity of innovation, the abstraction of work, and varied paths to revenue realization make the management of initiative and incentive that much more difficult.

On the other hand, it would be too short-sighted to quote and nit-pick a few choices from Taylor's seminal work. Taylor's premise is that employer and employee are in it together, and there is a win/win for both. It does, however, seem to portray the "average" worker as a more controllable form of resource, one inclined to "soldiering". If the traditional knowledge of the workman is converted to rules, laws, and formulae by data collection, then productivity enhancement through AI and machine learning will make Taylor proud, and not in the least at the cost of the worker, but in hopes that as repetitive work is handed off to machines to continuously improve, the worker is inevitably forced to evolve into a more thinking being: science replaces rule of thumb, process follows trial, co-operation is developed based on process, and eventually, despite the nature of the work, there is equal division of responsibility between management and workman.

By now you must be thinking: where are the X, Y, and Z? Theory X and Theory Y are theories of human motivation developed by Douglas McGregor. In (over)simplified terms, distrust of employees is Theory X and trust in employees is Theory Y. Theory Y is where Maslow and the Scanlon Plans come in, and Theory X is the command-and-control model of the military, driven largely by employees trained to accept the dictates of management. But in the age of information even the military has to evolve; quoting Stanley McChrystal from Team of Teams, where he sees the military as "...not a well-oiled machine but an adaptable, complex organism", the game, even for a very structured and hierarchical organization like the army, has changed from command and control to more empowered smaller units.

For some background on Theory Y and Z, we need to go back in time and sift through the very enjoyable papers by Maslow and the Scanlon plans, and while we will ignore the Hawthorne studies and Murray's system of needs, we will use Maslow's hierarchy of needs as encompassing similar concepts. Maslow's studies are geared towards a generalized theory of motivation based on physiological and "softer" human needs, and the Scanlon plan had its origination in more task-oriented organizations such as manufacturing plants, but both of them in some way influenced Theory Y. I will skip the Theory Z of Dr. Ouchi, as it is based on similar principles to Theory Y and extends it to Japanese cultural ethos (and I haven't had time to really go through it in detail).

Scientific management may be able to fine-tune tasks by timing and optimizing details, but humans, the predominant productivity engines of organizations, are not driven just by monetary incentives; besides, there is always a wall to financial lures. Theory X and Theory Y are specific ways of thinking about an organization's human resources. Concepts of soldiering and loafing are usually attributed to Theory X, which pictures employees as devoid of ambition, zeal, drive, and commitment. Theory Y, on the other hand, gives employees a more positive nod. Although Theory X gets associated with Taylor's concepts of soldiering and loafing, that is a very narrow view of the principles of scientific management. Practically, if you feel Big Brother is watching, and employees are coerced, controlled, and directed by punishment, you are in a Theory X environment. If you feel you have opportunities to learn and grow with the organization, and the other needs, like self-esteem, self-respect, self-confidence, autonomy for achievement, competence, and earning the deserved respect of fellow beings, are met for the individual (for now we can leave the discussion of quantifying those to the side), you are in Theory Y. I will be the first to admit these are not easy distinctions to make, since there are a lot of gray areas within each, and most organizations employ a mix of Theory X and Theory Y, but in the long run a company's culture does shift one way or the other. Also, both Theory Y and Taylor's principles eventually point towards bridging the gap between employer and employee needs; the mechanisms may differ in each.

Lastly, what are the evolutionary effects of Theory X and Theory Y, along with Taylor's principles of scientific management, on an organization? Here I will borrow, rather generously, from Darwin's basic theory, with organizations as a species. If we think about it, McGregor's basic tenets of Theory Y are evolutionary. An environment where a living cell or a being continuously evolves is akin to an organization where its employees grow. Each generation of employees will have individuals that meet the rigors of business and help sustain or grow it; those are the survivors. The interaction of the individual traits of those survivors with organizational environments will produce observable, desirable attributes, and collectively the organization will become a species of its own, with company culture as its life force. If unable to collect enough "survivors", it will become extinct, whereby its survivors, those workers with desirable traits, will either form a new species, "entrepreneurs", or will be adopted by the surviving collectives, the competitors, to start their survival cycle again.

It all starts from what theoretical assumptions management holds about managing, and whether the collective behavior of the organization is a consequence, a cause, or a symptom. Evolution is not voluntary, it is mandatory... now let's think where each of our organizations lies in the spectrum of cultural evolution from A to Z.

Wednesday, June 20, 2018

Your employees are not your best asset

...the asset is your employees' ability to process information specific to your business processes and make informed decisions that benefit the customers. One can argue that this is the same as having smart employees, and that a company can just hire intelligent people to create value. Critical here is the information specific to the company, generated through normal business workings, and what portion of that information actually becomes experience; that is where the future is being defined.

The case in point is the availability of information, the power to process that information, and how much of both is needed to make profitable decisions. Sane decisions can be made with relatively less information, but that comes at the cost of risk. In the absence of accuracy, risk must increase, and the processing unit, that is, your employee, must rely on his or her experience. The purpose of data and algorithms is not fancy charts and stunning visualizations; it is to provide simple insights that reduce risk by augmenting experience, and sometimes even correcting it.

It depends on the use case, but most non-life-threatening business situations require only a certain accuracy to be meaningful. The data doesn't improve the verity; the processing and tuning of the machine, whether it be the functional areas of your organization, warehouse operations, website optimization, speech in sales calls, transcripts in chats, features in your products, a thousand nuances of your various customer touch points, or the effect of a light spectrum on every leaf of a plant in a hydroponic farm, does.

When you compete at the human level, it comes down to domain knowledge, experience, intelligence, creativity, and the endeavor to get ahead. The combination of data, if it is appropriately labeled and collected to define context and outcomes, with the transforming power of distributed computing and iterative algorithms, can enhance employees. We already see a world dominated by companies whose revenue-to-employee ratio is an order of magnitude greater than average. The ability of employees to create value for the company lies in their expertise to enhance its products and services, which can either command a premium or be sold in numbers. Both require that more is done to the product, mandating that the same individuals do more of what needs to be done, at faster speeds and greater quality, or leapfrog through innovation.

The democratization of static and iterative insights and the automation of required actions, built upon scalable computing stacks in the cloud and out-of-the-box algorithms that fit various business cases, will define future competition in a different way. Yes, the employees will remain an asset (ignore the attention-grabbing title), but as the availability of data and the ability of machines to crunch and understand that data increase, so will the competition, and it will test the employees' ability to harness data and ingest real-time recommendations. An understanding of how the organization can augment the information processing power of its employees will create scalable advantages for the long term.

Saturday, May 5, 2018

Deep Learning the mind of an organization Part 2

Blogs, they are wonderful; they live and they express. The quest for knowledge is not a trivial pursuit, and the magnitude of captured thought is so huge that one can comfortably traverse the variants of one's own mental pathways without regard or concern for already-trodden paths. But caution must be exercised when venturing into the deep forests of imagined realities, as what may seem the discovery of a new universe might already have witnessed visitations by many an earlier interpretation. Nonetheless, the pleasure of following the flows of your own design should be pursued fearlessly, and maybe even just for fun, as this blog is.

Not to diminish the complexity of high-level abstractions, it is a bit more tedious to predict the required calculations. In part 1 of deep learning the brain of the organization, I simply stated that it is possible to capture the signals and learn from those indicators how the organization will behave. If the organization's actions can indeed be interpreted as the manifestations of the micro decisions of its people in various functions, and their combined efforts as the production of one or many desired outcomes, then the information flowing through those synapses has to be an order of magnitude larger than the neurons firing in one brain.

Where do we record such massive information from a single system of immense complexity? The search for that recorded complexity led me to the black box. The black box, aka the flight recorder, records every single input to a flying object for a short period of time, including the sounds of the buttons, the chatter of the cockpit, and a whole lot of other data. Though the BFU's regular findings were interesting, I lost my interest pretty quickly. The concept, though, was the same: everything must be recorded to understand the incident.

The science of understanding processes is hardly new, and it drove me towards the path of DSA. The analysis of complex systems and decision processes, in the context of learning the workings of the organization, is inevitable, and I really like the concept of "fuzziness" in human reasoning used to summarize ideas by Lotfi Zadeh. The concept, the "use of linguistic variables and fuzzy algorithms, ...provides an approximate and yet effective means of describing the behavior of systems which are too complex or too ill-defined to admit quantitative techniques of system analysis", fits well here: what systems can we claim are more complex and ill-defined than the ones shrouded in the idiosyncrasies of human interactions, for which we need a "methodological framework which is tolerant of imprecision and partial truths"?
My struggles with fuzzy algorithms, the universe of discourse, fuzzy feedbacks, and fuzzy instructions, within the limits of the time I can allocate to such explorations, are out of scope for this diary, but the ventures will indeed be a recurring theme in following posts.
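To give a flavor of what a linguistic variable looks like in code, here is a minimal sketch, entirely my own toy construction rather than anything from Zadeh's papers: a crisp value on an invented "complexity" scale is mapped to degrees of membership in overlapping fuzzy sets.

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 at a and c, peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# A linguistic variable "complexity" on a hypothetical 0-100 scale,
# carved into three overlapping fuzzy sets (breakpoints are invented).
terms = {
    "simple":   lambda x: triangular(x, -1, 0, 50),
    "moderate": lambda x: triangular(x, 25, 50, 75),
    "complex":  lambda x: triangular(x, 50, 100, 101),
}

def fuzzify(x):
    """Degree to which x belongs to each linguistic term."""
    return {name: mu(x) for name, mu in terms.items()}
```

A value of 60, for instance, is partly "moderate" and partly "complex" at once; that tolerance of partial truth is exactly the quality that makes the approach appealing for ill-defined organizational systems.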

Saturday, April 14, 2018

Digital Utility Agents and The Fabric

Webrooming and showrooming are not new concepts. Showrooming... you are right to balk at this, as showrooming is such yesterday's news that it ended with some people using Amazon's barcode scanner in brick-and-mortar stores. I think we can add to those two terms upstreaming and downstreaming, the online versions of showrooming, where visitors to one site go somewhere else to transact. If you are the party creating a deliberate redirection, you profit; in other cases an equilibrium is eventually reached. That equilibrium for some companies is the end of the business; others struggle to maintain or improve the status quo, and yet others carve out a place to thrive.

Although humans have a natural tendency towards the path of least resistance, and we lean towards the comfort and security of the known versus the unknown, the reality of the internet is no barrier to free roaming and no shame in changing loyalty. We all gravitate to the more frequent as the more stable, and many times the benevolence of the customer towards carefully crafted, free content does not offset the lethargy of creating a new account and the inertia of an easier-to-deal-with existing relation. As users' tolerance of ever-increasing content and marketing mechanisms reduces to the point that they automatically lean on the path of least resistance, the path that is the known "ecosystem" owned by a few companies, or the path where big chunks of the e-commerce funnel are owned by a few companies, the race for keeping users engaged on their platforms heats up; the quest of the large technology-driven companies.
       
The future carves its own path, and there is always a chance that companies vying for the internet of things, autonomous vehicles, AR, VR, AI, HRI (human-robot interfaces), and BCI (brain-computer interfaces) get their piece of computerized commerce. That is one of the reasons why many companies are trying to capture the next phase of computer intelligence by investing in AI, hoping that their intelligence becomes the core of the digital fabric, or at least the core of some integral part of it. For now, innovation and early entry into various important blocks like smart devices, operating systems, information scoring hubs, communication hubs, voice assistants, and many of the everyday DUAs (digital utility agents) that we rely on and engage with on a regular basis come from a few big companies. The digital utility agents are the software and hardware widgets, gadgets, sites, apps, and everything in between: all of that digital abstraction that delivers us some function, task, or emotional engagement, I call a digital utility agent.

While the larger players are on a constant quest to create product and service adjacencies in their offerings and to weave themselves into an intricate part of our day, from reciting the morning news to turning off the lights at night, other companies have to try to make their offerings a requisite part of the digital fabric by finding ways either to become part of the undercurrent that moves users towards the hubs controlling the information flow or to become hubs themselves. The future will belong to the companies that are working to become part of the digital fabric, not just using it.

Wednesday, January 24, 2018

You are the cookie

You are the cookie, and the world is the browser. In case you were wondering about privacy on the internet, I think that is a thing of the past. Soon, think tomorrow, your voice will be your thumbprint and your face will be your ID, and maybe you will not be connected to a pod supplying power to the matrix, but you will definitely be a series of bits tracked across the finite space that you engage in.

The last remaining battle front for the user 360 is their life at any moment. Thanks to some of our enterprising human counterparts, who are always devising ways to trade their cunning for some risk, you can't really walk into any sociable establishment without having some sort of camera scan you. The reel that needed to be viewed frame by frame a million times over to figure out if a person was wearing glasses can now identify the antagonist or the protagonist of a frame in a matter of milliseconds. The same person will leave a trace of personal attributes on voice-enabled devices; from speech to speaker, the tone and the timbre of a connected you.

The good thing is that you can stroll into any location offering commerce in any desirable merchandise or service, and you can simply gratify your need and go without acknowledging the trade, except that the amount will be deducted from your net worth automatically, and remit you will. Now, if you thought credit cards created a false realization of dreams, without real credit (calculated risk is not real net worth), imagine what the "distance" of never having to think about the payment will do to saving for the future; the pig will starve.

The point is not whether your journey will be private; it never really was, since the inception of credit cards, but the dots were never as connected and as easy to cluster as they are today. Privacy concerns, then, are not to be troubled with; the real dilemma is this: when you were a distant moving target, you had a chance in a world full of arrows, but when you are a bull's-eye that an ad or a drone can pinpoint with near-perfect accuracy, it is a different sort of debate. Then you become the cookie in the world that you browse in, and a cookie that you just can't delete.

Sunday, November 5, 2017

Deep learning the brain of an organization

The world is full of beautiful ideas, and people making something with them. While bouncing between pragmatism and idealism, playing with the MNIST and CIFAR datasets, figuring out entropy and trying to understand it with information theory, I came across such marvelous representations of information, in words and in visuals, that I wondered in awe at the gift and power of thought put into action; electric signals transforming into neons of imagination.

So how can we create the brain of an organization using deep learning and artificial intelligence? It is not too big an idea to get one's neurons firing in all directions, as the future of computation points to "quantum" leaps in processing, and we can let our suppositions soar.

The decisions in an organization depend on almost infinite factors. Just the decision maker's psychological propensity to make a particular decision seems difficult to measure quantitatively, let alone trying to gauge the impact of various departmental interactions, political and vested interests; a summation of all motives. This becomes a question about intelligence, and whether AI can actually point to intelligence itself, discerning between desirable and undesirable attributes in an organization when a medley of attributes results in desirable outcomes, lest we assume that the accuracy of a decision is determined by its success or failure alone. The problem then is determining which human signals can be characterized as positive or negative influences and what initial fixed proportions can be given to each. This may be done without giving a fixed weight to each human feature, but in order to do so, many samples will be required for the same decision, with data on the presence or absence of the features affecting the sampled decisions.
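Learning those proportions from repeated samples, rather than fixing them up front, is exactly what a simple classifier does. Here is a minimal sketch, with entirely invented signals and data, of logistic regression trained by gradient descent over decision samples; the learned weights play the role of the discovered positive or negative influence of each signal.

```python
import math

# Hypothetical decision samples: each records the presence (1) or
# absence (0) of three human signals, and whether the resulting
# decision was judged desirable (1) or not (0). Data is invented
# purely for illustration; the first signal tracks the outcome.
samples = [([1, 0, 1], 1), ([1, 1, 1], 1), ([0, 1, 0], 0),
           ([0, 0, 1], 0), ([1, 0, 0], 1), ([0, 1, 1], 0)] * 50

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, lr=0.1, epochs=50):
    """Plain logistic regression via stochastic gradient descent:
    the weights are the 'proportions' of each signal, learned from
    many samples instead of being assigned in advance."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

w, b = train(samples)
# In this toy data the first signal perfectly tracks desirable
# outcomes, so it earns the largest weight.
```

Of course, real organizational decisions are nowhere near this separable; the sketch only shows the mechanism by which fixed proportions can be avoided.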

Human interactions are complex, and the emotions and rationale that go into human buying decisions can be multifarious, yet from an organization's perspective the decisions that affect a sale can be few (for the sake of simplicity we will only think of a decision at the moment of transaction, based on current circumstances, without considering the effect of past experiences with the firm). We can assume that creating the perception of its product or services as the most logical as well as the most pleasing choice is thus the goal of an organization. The variables that entice the buyer to associate the store with the most plausible (I will not say effective here; let's just assume that plausible includes some part effectiveness along with any other complex mechanics happening in the buying decision) fulfillment of his need (or desire) for the product, equipment, or service can be infinite, but the variables that the company effectively controls and manipulates are not that many, and can be pretty much based on standard principles of the functions that affect the process.

To design a brain for the organization, we need functions of each department, determined by signals from the roles in each of those departments. The results of each department's decisions become inputs to a layer that can then be trained for a specific goal at that stage. The final output of an organizational decision will flow from the first high-level input nodes through the deep neural network of functions, with each layer providing its decisions in cascading layers that ultimately yield the output.
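The cascading structure described above can be sketched as a tiny feedforward pass, purely as an illustration; the departments, signals, and every weight below are invented, and a real version would of course be trained rather than hand-set.

```python
import math

def layer(inputs, weights, biases):
    """One departmental layer: each unit aggregates the signals of the
    previous layer and emits its own decision as a value in (0, 1)."""
    return [1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
            for ws, b in zip(weights, biases)]

# Hypothetical three-stage organization (all numbers invented):
# market signals feed two functional departments, whose decisions
# feed a final executive unit that emits the organizational output.
market_signals = [0.8, 0.2, 0.5]  # e.g. demand, cost, risk indicators
departments = layer(market_signals,
                    [[0.9, -0.4, 0.1], [0.2, 0.7, -0.5]],  # per-department weights
                    [0.0, 0.1])
decision = layer(departments, [[1.2, -0.8]], [-0.1])[0]
```

Each call to `layer` is one stage of the cascade; stacking more of them is what turns the departmental decision chain into the deep network of functions described above.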

Soon almost all qualitative and quantitative components of organizational processes and their outcomes will be recorded (some are already being stored), including human behavioral aspects. That, along with increased computing power and supervised and eventually unsupervised learning, will result in capturing the essence of the organization. I will not be surprised if investors in the future ask for such analysis, along with the lifetime value of customers, to determine the future prospects of a firm.

Tuesday, September 26, 2017

TRA to TAM, Which Technology Will I choose Part 1

The advent of new technologies and the sprouting torrent of startups, along with the companies dominating everyone's leap of imagination about the promise of a great connected future where the human mind will be just one piece of the intelligence puzzle supplemented by machines, is causing quite a stir in the everyday media. As I take stock of my thoughts, I cannot help but venture into a debate with myself about which technologies will become the applications that define a new era. The thoughts led to a desire to find out how we adopt new technologies and what leads people to fall in love with some "tech" while other technologies either wither away on the shelves of thrift stores or are eventually seen in the nostalgic corridors of companies, as also-rans that stimulated the conception of an idea, for their budding employees' intellectual curiosity to feed upon.

In my search for how to evaluate technology, I was drawn to the theory of reasoned action, which pushed me to the technology acceptance model (TAM) and then to the famous diffusion of innovation curve by Everett Rogers. If we look at the TAM in its simplest form and its main blocks, I think the media has done a good job of creating perceived usefulness for machine learning and AI. Some good PR, along with applications like self-driving cars, face recognition locks, voice-activated home assistants, and self-analyzing spreadsheets, has prepared users well for the perceived usefulness of AI. Such is the hype that the perceived usefulness, I would say, is now anticipatory usefulness for augmented reality and virtual reality. Where AR/VR does have a hurdle to cross is perceived ease of use, another important pillar of the TAM. The attitude towards using the technology and the behavioral intention vary from generation to generation. But I don't think anyone needs to prove that the generation born with the iPhone and Facebook will be more adaptive to an always-on, non-private world, knowing that sharing some information is volitional, but the life of a digital native is very transparent to data-native companies.

The point being made is that, except for perceived ease of use, the rest of the pillars of the TAM have actually shifted toward faster technology adoption. Along with perceived ease of use there are other variables to consider that play a critical role in forming users' perception. In their paper on the characteristics of innovation adoption, Tornatzky and Klein point out that users differ in their perceptions as they evaluate benefits through their particular cognitive frameworks, resulting in subjective conclusions which the individuals believe as their truth. In a similar vein, Festinger, in his cognitive dissonance theory, argues that users are predisposed to vote for the technology they have already adopted; therefore, after adoption, look-back studies of winning technologies cannot predict the future adoption of another technology. That makes for a very interesting observation, one that I am guilty of as well: even before a technology is really available in an application, we become vocal advocates of the technology, and hence its application becomes unavoidable. This is the trick that successfully works for tech giants with enough media muscle to shift perception. It doesn't work all the time; the case in point would be Google Glass. But will that be the fate of the Oculus Rift, Spectacles, Microsoft VR, or Google Daydream? All signals point to a rapidly rising adoption cycle for many of the new applications on the horizon.

Since I have (not even) barely touched on some of the studies around technology diffusion, its adoption and human psychology, I am but a slave to my desire to explore these further and bound to notate a continuation.



Wednesday, March 29, 2017

A simple survey for robot adaption

The buzz on the AI is deafening, one need only pick a current issue of any news paper and he will be rewarded with news about some machine with learning abilities. With all the buzz around artificial intelligence it would seem that the world is ready to embrace a walking talking humanoid, with all the goodness of great service minus the emotional baggage to direct their responses, a true stoic, immune to the pangs of hidden spears in human conversation and free from burdens of emotional accommodation. One would think, but generalizing a specialized opinion to the affirmation of masses sometimes have unintended postulations, incorrect most of the time.
To get a gentle feel for how the masses think about it, I set out to run a survey that was small (due to the out-of-pocket minimums I apply to such endeavors) and simple, because I wanted a sweeping yes-or-no opinion, which we are not going to generalize, but which we can look at and muse on. If you haven't used them, I recommend trying Google Consumer Surveys; they are easy to set up and allow for some nice opinion gathering.

Although autonomous cars, voice-activated devices, and speech recognition on every smart device are the flag bearers of AI advances, I feel the true test of AI will be robots.
The first question I asked was simply about people's perception of robot adoption. Robots as servers seemed a bit easy; I wanted to be a bit more bullish on the AI side, so I chose a more demanding profession for the robots in my survey: retail sales associate. The first question was






ASIMO may be a stretch of the imagination, but I still expected a bit more optimism about the prospects. I also wanted to see if people were mentally prepared to deal with a robot if they ever encountered one in a sales situation, hence the question


The same effect was observed here: the group of heavy online-buying adopters was more receptive to non-human interaction.

  • Overall, what % of your shopping is online (excluding groceries)?

There are a couple of other ways to look at the picture, using the demographic filters of gender and age, or device-adoption behavior such as using mobile for shopping or owning voice-activated home devices. One thing I found interesting: of the people in the 18-to-44 age group who shopped 75% or more online, not one individual said they would try to find a human, irrespective of gender. If I had to guess, I think robots will pop up soon, first as more static answering machines in retail, then as more interactive assistants, and finally as autonomous helpers. Although I wouldn't mind walking into a store and activating a robot that walks alongside me through the aisles, answers my questions, looks up inventory, gives me suggestions, brings up deals, and so on, I would prefer C-3PO as good company but will settle for R2-D2 or BB-8.
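The kind of slicing described above (age group plus online-shopping share, then tallying reactions by gender) can be sketched in a few lines. The field names, answer strings, and responses below are illustrative assumptions, not the actual survey export.

```python
# Hypothetical extract of survey responses; fields and values are
# illustrative, not the real Google Consumer Surveys export.
from collections import Counter

responses = [
    {"age": "18-24", "gender": "F", "pct_online": "75% or more", "reaction": "Interact with the robot"},
    {"age": "25-34", "gender": "M", "pct_online": "75% or more", "reaction": "Interact with the robot"},
    {"age": "35-44", "gender": "F", "pct_online": "50-74%",      "reaction": "Try to find a human"},
    {"age": "45-54", "gender": "M", "pct_online": "75% or more", "reaction": "Try to find a human"},
    {"age": "25-34", "gender": "M", "pct_online": "25-49%",      "reaction": "Try to find a human"},
    {"age": "18-24", "gender": "F", "pct_online": "75% or more", "reaction": "Interact with the robot"},
]

# Slice down to heavy online shoppers aged 18 to 44...
heavy = [r for r in responses
         if r["age"] in {"18-24", "25-34", "35-44"}
         and r["pct_online"] == "75% or more"]

# ...and tally their reactions, split by gender.
tally = Counter((r["gender"], r["reaction"]) for r in heavy)
print(tally)
```

In this toy sample, every heavy online shopper in the 18-to-44 band chooses to interact with the robot, echoing the pattern observed in the real responses.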

Saturday, March 18, 2017

Age of connection will change information flow

Search engines might become irrelevant in a world of connected devices, where devices talk to each other and the information on each device is validated by other device owners, and by the devices themselves, through a mix of AI and protocols. There should be a separate protocol for devices to talk (not merely connect) in terms of human language, and the AI that translates and rates it should emerge sooner or later. That protocol, with the power of NLP and AI, will put the power of information truly in the hands of crowds, as compilation and ranking by a few sources will become unnecessary. The race to control the device and information ecosystem will become immaterial if AI becomes inherent in the devices, regulated by protocol and by universally accepted signals for supervised learning (not owned by corporations and big information aggregators) that are neutral and publicly validated; a blockchain of AI and information.

The age of information overflow started with the rise of the internet, and it has now morphed into the age of connections: connecting individuals to the right information, in the right context, at the point of need, almost as fast as the thought of the need comes to them, so that it can be fulfilled. Maybe not at the speed of thought, but at the speed of their connectivity.

The power of connections is not in the tribes. When one looks for information, it is almost always an individual providing it. More experts jump in to create alternate versions, and the crowd participates to tune the information with their comments, ultimately declaring a winner through votes or simple affirmations. You may call that crowd a tribe, but the same individual will be a member of different tribes; each person can be the leader of one tribe yet a member of many.

I use Stack Overflow quite a bit. Do I know a single expert on it? Maybe, but chances are the experts I remember are far fewer than the pieces of information I have acquired through it. Even knowing a few, I don't know the extent of a contributor's expertise, but I considered the nod of the crowd validation enough to use the information.

The platform, as the voice of the crowd, is a bigger validation of an expert opinion than the opinion's donor. In a few cases the expert is able to transcend the crowd by sheer cult of personality, in which case the combination of what and who becomes the knowledge, not just the piece of information. For which forms of information does it become irrelevant who imparts the knowledge? Is going to MIT more meaningful than taking all of MIT's courses online? The educational platforms on the internet vary in the size of their offerings and in their value; the same knowledge disseminated by similar means has different value. But eventually that gap will subside, when learning becomes the true objective and verification of capability becomes more quantitative than branded.

The original question was how the ease of disseminating information through the internet to the seeker is shaping the future of experts. The seeker is an important part of the puzzle, because I am interested in knowledge that needs to be disseminated. Is there a difference in how knowledge is shared, by the type of knowledge and by its audience: professionals seeking professional advice, or laypeople seeking professional advice? How has knowledge been shared throughout history, and how are the hubs of knowledge formed?

Libraries were the research vehicle of the past. You could walk into one and do your research on the existing knowledge base. Now you can perform the same function many times faster, and instead of knowledge being the hub, you are the hub. You can research that knowledge, contribute to it with your comments, and become a sharing or explaining resource, and that very static piece of information is suddenly very dynamic. Finding the knowledge, giving it your voice, and then being able to publish it instantly: that is the modern dilemma. It loads the system with redundant information, but it also amplifies the information. You can be right or wrong in your additions, but that will eventually pass through the digestive system of the internet, munched on by the crowd if at all, then spewed onto the organic pile of information that gets buried a hundred pages down in the search results, and no one will know.

Granularity and segmentation of information, by the user and by the type of information: that is the internet. The jazz about personalized content and targeting is nothing more than connecting the right audience with the precise information in the least amount of time. To make that happen, the original portal where the quest for a piece of knowledge begins matters, be it the web or a smart device (imagine all smart devices: cars, watches, phones, homes, etc.), because interpreting the intent of the person searching is the first step to even attempting to get them the right information right away. Words tell only half the story, and the rest of the context must come from everything else: sensors and artificial intelligence.

Relevant information was always the quest, and although content creation is reaching monumental proportions, crowd validation along with machine learning is also breaking new ground. In the current digital realm, a few big players dominate information creation because crowds flock to them, but eventually the system (any connected device) will be able to find the most relevant information from any available digital source, like a certain recipe stored on the connected personal oven of an unknown chef.




Sunday, December 25, 2016

Age of computers and data: where do humans stand?

One thing that computers can't create is knowledge... and the real question is whether human knowledge and discovery are at a place where the incremental effort and value are just regurgitation: maybe complex combinations that appear unique but are really patterns and insights about existing knowledge.

An increase in computing power and in the understanding of pattern recognition in human communication, be it facial expressions, speech recognition, or language translation, is making emotionless machines appear cognitive. Pleasure, pain, memory, and experience, combined with the contexts (surroundings, circumstances) in which they all occurred, the intensity of each experience, and the DNA of the individual that makes the absorption of each experience unique to the subject, eventually mold decision making into a personal trait. In short, the inputs that go into the human machine, with its abstractions of data signals, are almost infinite. Although many particular human behaviors have become more predictable, at the individual level it may be hard to guess which way a particular individual will swing, unless extensive psychoanalysis experiments and results for that individual are evaluated; at the group level, however, probabilistic behavioral outcomes are being modeled in many fields.

The point in question is AI, the latest and greatest of the current buzzwords, which has arrived. Soon it will be difficult to distinguish whether you are talking to a man or a machine online or on the phone (except for the very balanced voice of Karen Jacobsen telling you she is recalculating, when you can almost feel she is in the car). These new developments have led to some very ambitious projects, and the latest one I read about aims to develop management decisions by experimentation, recording, and then "algorithmizing". This is quite an interesting approach, and it will probably work in certain situations. Hedge funds that can crunch 150 million signals into a few decisions are now embarking on the more precarious route of mapping human interactions to simple outcomes. The problem is that those interactions rest on infinitely many other "nuances" of individuals, which Bridgewater probably considers noise that can be suppressed. For one thing, decision making by computer brains is easy: it is devoid of emotional manipulations and political consequences and hence, shall I say, boring; easy nonetheless, and in some cases, where very defined paths must exist, efficient. Humans are also leaving an ever-increasing trail of their activities, and of the decisions following those activities, which makes defining the inputs and outputs of machines more accurately "human". The (people with) machines, as it so happens, will always carry a temporary advantage, because in a hyper-competitive market nanoseconds can switch millions. That being said, what one can "predict" using the same algorithms and similar data points very quickly becomes available to all market participants at some cost. The game ultimately returns to humans to weigh the inputs, "calibrate", and then interpret the outputs, and the betting begins again.


The Digitopoly blog talks about judgement becoming the currency of the day in the machine age, and what is funny about that is that judgement has always been the most precious of gifts at the higher echelons of management. The conundrum is judgement skills versus prediction skills: can one really predict without judgement, which is a direct result of experience (practice and observation)? Prediction in this sense is basically one more input to the judgement; the weight of prediction on the eventual judgement will be larger, simply because the machine will evaluate far more inputs in a shorter amount of time, more frequently, and arrive at a logical conclusion with consistency, accuracy, and speed.
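The weighting idea above, with the machine prediction entering the human judgement as one heavily weighted input, can be sketched as a toy blend. The function, weights, and scores below are illustrative assumptions, not a real decision model.

```python
# Toy sketch: a final judgement score blends a machine prediction with a
# human prior, with the machine carrying the larger weight.
def judgement(machine_prediction: float, human_prior: float,
              machine_weight: float = 0.7) -> float:
    """Blend a machine prediction (0..1) with a human prior (0..1)."""
    return machine_weight * machine_prediction + (1 - machine_weight) * human_prior

# The machine is bullish (0.9), the human is lukewarm (0.4):
score = judgement(0.9, 0.4)
print(round(score, 2))  # 0.7*0.9 + 0.3*0.4 = 0.75
```

The machine's speed and consistency argue for the larger weight, but the human still calibrates the blend, which is the point of the paragraph above.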

The improvements in AI are an extension of computing power. Computers are leveraged to do a great many tasks but depend on humans. A certain type of knowledge is being created by the immensity of data and algorithms: unique understandings from the cross-section of this immense data and these algorithms/models are forming the basis of a new type of knowledge. This knowledge has its own basic blocks that can form bigger themes, and simple conclusions around those themes under known sets of conditions and inputs. More and more humans will master this knowledge, and enhancements in AI will become extensions of humans, just like language. In essence, AI will not be AI but EI: enhanced intelligence for humans.