Red tapism, corporate innovation and growth

Innovate or die – haven’t we all heard this? Put red tape and innovation in the same sentence, though, and it sounds conflicting, right?

Organisations try to live up to “Innovate or die” by becoming Agile, migrating to cloud services or adopting OKRs – essentially moving in the right direction to be able to deliver better and faster. As technology advances exponentially and customers become more demanding, organisations must be able to respond quickly to market demands in order to compete, which requires agility.

However, trying to be agile in an established organisation with legacy systems and non-agile methods is a big challenge. Untangling teams and systems that have always worked in silos is a big hurdle and the hardest part of any organisational transformation, not to forget the red tape created by outdated processes. Organisations originally created processes and procedures to ensure predictable outcomes and to mitigate risks; those processes left little room for experimentation or agility. But in the current digital landscape, this type of bureaucracy is simply too time-consuming and not at all cost-effective.

  • Most enterprises have standardised tedious approval processes where some of the approvers do not even possess the technical know-how to judge or review the matter in question. This results in unending rounds of justifying even the simplest decisions.
  • Procurement teams also add to the red tape, making it difficult for teams to acquire services and products that could speed up their development. We have all been through this – waiting months to get Slack or JIRA approved. And IT Security will not allow Trello or Google Drive, so go figure!
  • And then there is the fear of cloud solutions. There is no denying that privacy is a major concern when it comes to data, and customers want to ensure that the services and products they use handle their data well. But a cloud solution provider is more likely to have robust, well-configured firewalls and data security practices than an average enterprise, as it is the focus of their business. When choosing cloud service vendors, it is important to keep in mind that while the cost of regulatory compliance is substantial, the cost of non-compliance is higher.
  • To top it all, there is the fear of the unknown, which is a huge blocker for innovation. It is therefore important to educate everyone involved on a transformation journey and get their buy-in.

To be able to innovate, enterprises need to deliver end-to-end business value in increments, test and validate results before starting full scale development. Creating a culture of testing and experimentation demands processes and methodologies that support faster delivery of customer-centric value, with constant room for improvement.

Testing a few assumptions that could lead to a potential minimum viable product should not require written approval from legal, compliance, finance, risk management, procurement, etc. Experimentation could be conducted with masked data, and that should not entail long-winded paperwork such as a detailed risk analysis and architectural artefacts.
Red tapism is a sure-fire way to kill creativity, which in turn ensures no innovation or improvement.
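
As an illustration of how lightweight such masking can be, here is a rough Python sketch (the field names and key are hypothetical) that pseudonymises PII with a keyed hash, so teams can experiment on realistic data without exposing identities:

```python
import hashlib
import hmac

# Hypothetical PII fields to mask before handing data to an experiment.
PII_FIELDS = {"name", "email", "phone"}

def mask_record(record: dict, secret: bytes) -> dict:
    """Replace PII values with a keyed hash so records stay joinable
    across datasets but no longer reveal identities."""
    masked = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hmac.new(secret, str(value).encode(), hashlib.sha256)
            masked[key] = digest.hexdigest()[:12]
        else:
            masked[key] = value
    return masked

customer = {"name": "Jane Doe", "email": "jane@example.com", "purchases": 7}
masked = mask_record(customer, secret=b"experiment-key")
```

Because the hash is keyed and deterministic, the same customer masks to the same token everywhere, which keeps joins and aggregates intact while the experiment runs.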

If organisations really want faster, qualitative deliveries, freethinking leaders should not be afraid to rock the corporate boat and scrap obsolete processes and procedures.

Architecting Modern Data Platforms

As organisations struggle to capture and leverage multitudes of data, there is a surge of technological options to choose from. Well-designed data platforms facilitate experimentation, shorten time to market, adapt faster to the latest advancements in data technologies and promote self-service, thereby accelerating data adoption. With data being the key enabler for business transformations, it is vital to build platforms that accelerate validation of use cases and can handle scaling of both use cases and users. Designing a platform elastic enough to embody all of the above can be quite a daunting task.


The primary points to consider when architecting modern data platforms:

  • Customer centric

Despite all the emphasis on hyper-personalisation, organisations battle immensely with legacy data technologies to deliver personalisation and customer experience. Thinking along the lines of creating a 360° customer view helps align technological choices with business pain points.

  • Cloud Native

Cloud solutions support elastic scaling, high availability and secure, fully managed services with integration to a range of enterprise security systems including LDAP, Active Directory, Kerberos and SAML. Cloud solutions allow pluggable architectures – replacing components when better options become available, with minimal reconstruction. Cloud platforms eliminate the time-consuming work of provisioning resources and infrastructure, thereby reducing time to market.

  • Multi-platform architectures

Be it multi-cloud or multiple data storage patterns, it should be the use cases that dictate the architectural patterns and not vice versa. Data warehouses, data lakes and NoSQL databases can all co-exist on multi-cloud platforms if the use cases demand it. Organisations should avoid platform/vendor lock-in, because it forces businesses to make technology choices that are not in the best interests of the company.

  • Microservice-enabled

It is critical to envision data as more than a means for visualisation or a diagnostic tool: data helps organisations adapt to change in evolving business environments and to innovate, and every company wants to be the first to come up with innovative products and services. Monolithic applications are a major bottleneck here. In a microservices-based design, small decoupled services are developed completely independently of each other to achieve business requirements faster, communicating generally through REST APIs or event streams.
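
To illustrate the decoupling, here is a deliberately simplified Python sketch (the service and event names are invented) where a producer and a consumer share nothing but an event stream, simulated here with an in-process queue standing in for something like a Kafka topic:

```python
import json
import queue

# Stand-in for an event stream (e.g. a Kafka topic); names are illustrative.
payments_topic = queue.Queue()

def payment_service(order_id: str, amount: float) -> None:
    """Producer: publishes an event and knows nothing about its consumers."""
    event = {"type": "payment_completed", "order_id": order_id, "amount": amount}
    payments_topic.put(json.dumps(event))

def bookkeeping_service(ledger: list) -> None:
    """Consumer: reacts to events, deployable and testable on its own."""
    while not payments_topic.empty():
        event = json.loads(payments_topic.get())
        if event["type"] == "payment_completed":
            ledger.append((event["order_id"], event["amount"]))

ledger = []
payment_service("order-42", 99.0)
bookkeeping_service(ledger)
```

Because the two services agree only on the event schema, either can be rewritten, redeployed or scaled without touching the other – which is the whole point of the pattern.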

  • Flexible

Modern data platforms should be flexible enough to accommodate rapidly evolving business requirements, be it integrating new data sources or feeding data into futuristic data products. Modern data platforms should make it simple to test new ideas on a small scale prior to making heavy investments in infrastructure.

Modernisation continues to be a strong trend in data platforms, whether on Hadoop, RDBMS or multi-tenant solutions. It is the ease of integrating new data sources, TCO, prototyping capabilities, security and scaling that matter most in modern platform architectures.

 

Three reasons why Big Data projects fail


I have not been regular with my personal blog because I have been blogging elsewhere.

Here are the links to my latest blog posts about why Big Data projects fail and how to attract more women into tech.

Having worked extensively in the Big Data and IoT space, I have closely observed failures over and over again, and the reasons for failure are repetitive:

  • Wrong use cases
  • Wrongly staffed projects
  • Obsolete technology

Read the blog post for more details:

Three reasons why Big Data projects so often fail

Being a woman in tech, or a woman in data, I am often the only woman in meetings, trainings and discussions, which feels weird. With so few women in tech, it gets easier to discriminate against the few that do exist. Incidents of mansplaining and gaslighting are rampant, and it is the victim that gets labelled a drama queen while the abusers go scot-free. Organisations that are serious about increasing the number of women in tech need to address the glass ceiling, gender wage gaps and bro-culture, and cultivate an inclusive work atmosphere. Read my post on how to get more women into tech.

How to Get More Women in Tech

How to become a Big Data Analyst

Anyone who works in the tech industry is aware of the rising demand for analytics and machine learning professionals. More and more organisations have been jumping on the data-driven decision-making bandwagon, thereby accumulating loads of data pertaining to their business. In order to make sense of all the data gathered, organisations will require Big Data Analysts to decipher it.

Data Analysts have traditionally worked with pre-formatted data, served up by IT departments, to perform analysis. But with the need for real-time or near-real-time analytics to serve end customers better and faster, analysis needs to be performed faster, making the dependency on IT departments a bottleneck. Analysts are now required to understand data streams that ingest millions of records into databases or file systems, Lambda architecture, and batch processing, in order to understand the influx of data.
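
The Lambda architecture mentioned above can be sketched roughly as follows – a precomputed batch view merged with a speed layer at query time (the figures and event names here are made up):

```python
from collections import Counter

# Batch layer: a precomputed view over historical data (recomputed nightly).
batch_view = Counter({"page_view": 1_000_000, "purchase": 20_000})

# Speed layer: incremental counts over events since the last batch run.
speed_view = Counter()

def ingest_realtime(event_type: str) -> None:
    """Record one event in the speed layer as it streams in."""
    speed_view[event_type] += 1

def query(event_type: str) -> int:
    """Serving layer: merge the batch and real-time views at query time."""
    return batch_view[event_type] + speed_view[event_type]

ingest_realtime("purchase")
ingest_realtime("purchase")
```

The point of the pattern is that queries see up-to-date totals even though the heavy batch recomputation only runs periodically.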

Also analysing larger amounts of data requires skills that range from understanding the business complexities, the market and the competitors to a wide range of technical skills in data extraction, data cleaning and transformation, data modelling and statistical methods.

Analytics, being a relatively new field, is struggling to supply the market with highly skilled Big Data Analysts. Being a Big Data Analyst requires a thorough understanding of data architecture and the data flow from source systems into the big data platform. One can always stick to a specific industry domain and specialise within it – for example Healthcare Analytics, Marketing Analytics, Financial Analytics, Operations Analytics, People Analytics, Gaming Analytics etc. But mastering the end-to-end data chain can lead to plenty of opportunities, irrespective of industry domain.

The entire Data and Analytics suite includes the following gamut of stages:

  • Data integrations – connecting disparate data sources
  • Data security and governance – ensuring data integrity and access rights
  • Master data management – ensuring consistency and uniformity of data
  • Data Extraction, Transformation and Loading – making raw data business user friendly
  • Hadoop and HDFS – big data storage mechanisms
  • SQL/ Hive / Pig – data query languages
  • R/ Python – programming languages for data analysis and mining
  • Data science algorithms like Naive Bayes, K-means, AdaBoost etc. – Machine learning algorithms for clustering, classification
  • Data Architecture – designing all of the above in an optimised way to deliver business insights
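
To give a flavour of the clustering algorithms in the list above, here is a deliberately minimal k-means on one-dimensional data. Real work would of course reach for a library, but the mechanics are the same: assign each point to the nearest centroid, then move each centroid to the mean of its cluster.

```python
import random

def kmeans_1d(points, k, iterations=20, seed=0):
    """Toy k-means on 1-D data: alternate between assigning points to
    the nearest centroid and recomputing centroids as cluster means."""
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Keep the old centroid if a cluster ends up empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups: one around 1-3, one around 100-102.
data = [1.0, 2.0, 3.0, 100.0, 101.0, 102.0]
centroids = kmeans_1d(data, k=2)
```

With data this well separated, the centroids settle at the two cluster means regardless of the random initialisation.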

The new-age, versatile Big Data Analyst is one who understands the complexity of data integrations using APIs, connectors or ETL (Extraction, Transformation and Loading); designs data flows from disparate systems while keeping data security and quality in mind; can code in SQL or Hive and R or Python; is well acquainted with machine learning algorithms; and has a knack for understanding business complexities.

Since Big Data and Analytics are constantly evolving, it is imperative for anyone aiming at a career in the field to be well versed in the latest tech stack and architectural breakthroughs. Some ways of doing so:

  • Following knowledgeable industry leaders or big data thought leaders on Twitter
  • Joining Big Data related groups on LinkedIn
  • Following Big Data influencers on LinkedIn
  • Attending events, conferences and seminars on Big Data
  • Connecting with peers within the Big Data industry
  • Last but not least (and probably most important), enrolling in MOOCs (Massive Open Online Courses) and/or reading Big Data books

Since analytics is a vast field encompassing several operations, one could choose to specialise in parts of the analytics chain: data engineers specialise in highly scalable data management systems, data scientists in machine learning algorithms, and data architects in the overall data integrations, data flow and storage mechanisms. But in order to excel and future-proof a career in the world of Big Data, one needs to master more than one area. A data analyst who is acquainted with all the steps involved in data analysis, from data extraction to insights, is an asset to any organisation and will be much sought after!

Four steps to becoming a Data-Driven organisation


Not a day goes by when our LinkedIn news feed is not flooded with mentions of AI and machine learning benefitting and changing the ways of mankind like never before. This hype surrounding AI and machine learning has resulted in most organisations jumping on the bandwagon without proper evaluation. A couple of years ago, the term Big Data enjoyed a similar hyped status, but it has lately been losing its lustre to all the talk about AI and machine learning.

The truth, however, is that AI and Big Data need to coexist and converge. Merely collecting and storing data in huge amounts will prove futile unless AI and analytics are used to generate meaningful insights that help businesses, enhance customer experience or increase revenue.

Making an organisation Data-Driven will take time and will happen in stages. While there are no sure-fire ways to create a Data-Driven organisation, below are some ways that could lead to a change:

  1. Strategy – It all starts with a clearly defined strategy in place, stating the Whys, Hows, Whos and Whens. A clear strategy helps in raising awareness across the organisation, about the topic in focus (data in this case) and creates a sense of urgency around the change process. It is imperative that the entire organisation understands the importance and implications of a data-driven organisation, thus encouraging people to update their skill sets and raise their level of data awareness. An all round data strategy should not only include the technology required for execution but the kind of competence and people skills and the sort of conducive atmosphere required for a data-driven organisation to thrive.
  2. People – Just as there are different kinds of skills required within a Marketing or a Software organisation, there are different skill sets for the different job roles within a data organisation. But because of the hype surrounding Machine Learning and AI, while companies lack practical data know-how, the tendency is to either hire the wrong people or assign the wrong tasks to the right people! Not everyone in the data organisation has to be a data scientist. There will be people required to work on data architecture, data infrastructure, data engineering, data science and business analysis. These could very well be the same person, if the organisation is lucky enough. But it is unfair to hire a data engineer and assign him/her the task of building predictive models, or to hire a data scientist only to have them develop BI reports. Strategists will have to spend the time required to understand the nuances of the skills and expertise needed in a data organisation, but it will be worth it to retain and grow the talent pool required for a Data-driven organisation.
  3. Patience – Creating a Data-driven organisation will require ample amounts of patience and perseverance. If data has not been involved in the decision-making process earlier, then the data is most probably not in a state that can be used readily, or maybe there is no data, or not enough, to begin with! In that case, it has to start with gathering the data required to achieve the business goals. Transaction systems have a very different database design from the data storage mechanisms used for analytics purposes, which entails a design and architecting process before the data can be analysed. Moreover, as analysts dig into the transaction data, they will surely encounter non-existent or hard-to-retrieve data and unearth data quality and data integration problems due to the existence of data silos. In a data-driven organisation, all data sources are integrated to provide a single enterprise version of the truth, be it Customer, Sales or Marketing data. Building a data platform that integrates all business data sources while ensuring quality, data integrity and security is a time-consuming process. Organisations will have to take this lead time into consideration when strategising a Data-driven decision-making approach.
  4. Organisational Culture – The purpose of a Data-driven organisation is to empower employees by means of data and information sharing, to enable the organisation to collectively achieve its business goals. This approach requires employees to be data aware and not use gut feeling to make decisions, which could be a whole new approach for many. This new way of working requires organisational change management: educating people to use facts and figures to arrive at conclusions and make decisions. If an organisation is fairly data aware, in the sense that metrics are used to measure certain processes, then in order to turn Data-driven the organisation has to take steps to use data proactively (read: Predictive Analytics) and not just to summarise events that have happened. The CDOs/CMOs need to drive data awareness by showcasing quick wins and success cases of Data-driven approaches, as a means to make data the foundation of every decision-making process.

Some organisations may take longer to implement a Data-driven culture than others, but there is no way an organisation can become Data-driven just like that, one fine day! If the CDOs can gauge that the organisation has a longer incubation period, then it is good to start with raising data awareness and introducing a BI/Data warehousing team. It is not recommended to leap directly to AI and hire data scientists, who are then left in the lurch because the organisation and the infrastructure are too rudimentary to make use of their expertise.

A Data-driven organisation culture starts with the right strategy in place, followed by the right people and technology, evaluating and optimising the entire process, intermittently.

Continuous delivery of Analytics


 

I am biased towards Analytics not only because it is my bread and butter but also my passion. But seriously, Analytics is the most important factor that helps drive businesses forward by providing insights into sales, revenue generation means, operations, competitors and customer satisfaction.

Analytics being paramount to businesses, its placement is still a matter of dispute. The organisations that get it right and are using data to drive their businesses understand full well that Analytics is neither a part of IT nor a part of business. It is somewhere in between, an entity in itself.

The insights generated from Analytics are all about business drivers:

  • Performance of the product (Product Analytics)
  • How well is the product perceived by customers (Customer Experience)
  • Can the business generate larger margins without increasing the price of the product (Cost Optimisation)
  • What is the bounce rate and what causes bounce (Funnel Analytics)
  • Getting to know the target audience better (Customer Analytics)

While the above insights are business related and require a deep understanding of the product, online marketing knowledge, data stickiness mastery and product management skills, there is a huge IT infrastructure behind the scenes to be able to gather the required data and generate the insights.

To be able to generate the business insights required to drive online and offline traffic or increase sales, organisations need to understand their targeted customer base better. Understanding customer behaviour or product performance entails quite a number of technical tasks in the background:

  • Logging events on the website or app such as registration, add to cart, add to wish list, proceed to payment etc. (Data Pipelines)
  • Having in place a scalable data storage and fast computing infrastructure, which requires knowledge about the various layers of tech stack
  • Utilising machine learning and AI to implement Predictive Analytics and recommendations
  • Implementing data visualisation tools to distribute data easily throughout the organisation to facilitate data driven decision making and spread data literacy
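
The event logging mentioned in the first bullet often amounts to emitting structured records into a pipeline; a minimal sketch (the event names and fields are illustrative) might look like this:

```python
import json
import time

event_log = []  # stand-in for a pipeline sink such as a message queue

def track(event_name: str, user_id: str, **properties) -> dict:
    """Emit one structured analytics event as a JSON record."""
    event = {
        "event": event_name,
        "user_id": user_id,
        "ts": time.time(),
        "properties": properties,
    }
    event_log.append(json.dumps(event))
    return event

track("add_to_cart", user_id="u1", sku="sku-123", price=19.9)
track("proceed_to_payment", user_id="u1", cart_value=19.9)
```

Keeping every event in one consistent shape (name, user, timestamp, free-form properties) is what later makes funnels, segmentation and cohort analysis possible downstream.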

As is the case, Analytics cannot be boxed into either Tech or Business. It is a joint effort of business and tech to understand the business requirements and translate them into technically implementable steps. Many organisations make the mistake of involving Analytics only at the end stage of product or concept development, which is almost a sure-fire fiasco. Analytics needs to be involved at every step of product development, customer experience, UX design and data infrastructure, to make sure that the events – the data points that lead to insights – are in place from the beginning.

Delivering Analytics solutions is a collaborative effort that involves DevOps, data engineers, UX designers, online marketeers, social media strategists, IT strategists, Business Analysts, IT/Data architects and data scientists. A close co-operation between tech and business leads to continuous delivery of smarter and faster automations, enhanced customer experience and business insights.

Build. Measure. Evaluate. Optimise. Reevaluate.

 

 

AARRR Metrics for a Fintech


Let’s assume this is a case study for a Fintech company’s KPI definition.

Company X is a Fintech company providing payment solutions to SMEs and small businesses via a mobile app, card reader and NFC. Company X’s solutions provide bookkeeping and analytics features to its customers by means of tracking product usage and events.

Tracking mobile app and website usage is done using web and mobile analytics tools such as Localytics, Flurry, Google Analytics, Tealium, Xiti etc. But in some cases the data from the analytics tools is not enough to draw conclusions, and hence requires additional data from various systems such as CRM, financial transaction systems, CMS and inventory control systems. Due to the need for blending data from disparate systems, a data strategy needs to be defined and a robust, scalable data architecture needs to be in place.

I would like to provide two relevant blog posts from my own blog that point to the concepts of growth hacking and data blending.

Data Value Chain

Growth Hacking

KPIs

Data monetisation for the growth of businesses entails tracking user behaviour both online and offline to optimise products and processes. A list of KPIs or metrics to measure product usage and means of revenue generation is used as a guideline for data monetisation efforts. Whether the goal is to assess the global performance of a site or to measure the impact of a specific campaign or product feature change, a set of indicators will be needed to focus on the changing parameters.

There are five metrics defined by Dave McClure – Acquisition, Activation, Retention, Referral, Revenue, or AARRR, also known as the pirate metrics – that serve as a good indicator of business growth.

For each metric area there are several KPIs defined, and for each KPI there are again four essential components or ways of analysing:

  • Data points – Data points are the points in the app or site that generate interesting insights about the business in question. It could be individual features in the product or events.
  • Funnels – Setting up funnels ensures tracking all the steps that lead to completion of a particular process on the site or app like tracking steps that lead to an online payment page or the steps that lead to a signing up for a newsletter.
  • Segmentation – Segmenting the potential and existing customer base to understand their wants and needs, in order to serve them better, which is a means of revenue generation. Segmentation can be:
  • Behavioral – Users who spend lot of time on the site or app, frequently login or rarely login, browsers, visitors that leave without making purchases or visitors that make purchases
  • Technical – The browsers used, the OS versions, devices used and if the users have saved the site as a bookmark or enter the site through search engines or social networks
  • Demographics – Clustering users based on their age, gender, location etc.
  • Cohorts – Cohorts are also a type of segmentation but more from a time series perspective to be able to compare data sets at different points of time. For example checking trends or shopping behaviors at different points in time.
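
The funnel idea above can be made concrete with a small computation: for each step, count the users who reached it having completed all the earlier steps (the user IDs and step names below are invented):

```python
def funnel_conversion(user_events, steps):
    """For each funnel step, count users who reached that step having
    completed all earlier steps, plus the conversion rate vs. step one."""
    reached = []
    users = set(user_events)
    for step in steps:
        users = {u for u in users if step in user_events[u]}
        reached.append(len(users))
    base = reached[0] or 1
    return [(step, n, n / base) for step, n in zip(steps, reached)]

events = {
    "u1": ["visit", "signup", "payment"],
    "u2": ["visit", "signup"],
    "u3": ["visit"],
    "u4": ["visit", "payment"],   # skipped signup, so drops out at step 2
}
funnel = funnel_conversion(events, ["visit", "signup", "payment"])
```

Note that u4 pays but never signs up, so a strict funnel drops them at the signup step – exactly the kind of drop-off a funnel report is meant to surface.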

The pirate metrics for product usage can be broadly classified as below:

Acquisition

The process of acquiring customers, which means tracking new customers that visit the site, download the app or search for the product. The KPIs for acquisition include all the metrics that indicate growth or changing trends:

  • Number of unique visitors
  • % mobile traffic
  • % web traffic
  • % traffic from social networks
  • % traffic from search engines
  • Number of app downloads
  • Visit trends
  • Page view trends
  • App Download trends
  • New User Account Creation Rate
  • Bounce Rate
  • Funnel analysis for conversion
  • Number of new customers in the last Month/Quarter
  • Number of new customers YoY growth
  • Campaign effectiveness – measuring the number of customers signing up or deregistering
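
Several of these acquisition KPIs are simple ratios over session data. As one hypothetical example, bounce rate – the share of sessions with a single page view – could be computed like this:

```python
def bounce_rate(sessions):
    """Share of sessions that viewed exactly one page."""
    if not sessions:
        return 0.0
    bounced = sum(1 for s in sessions if s["page_views"] == 1)
    return bounced / len(sessions)

# Illustrative session records; a real pipeline would build these
# from the raw page-view event stream.
sessions = [
    {"id": "s1", "page_views": 1},
    {"id": "s2", "page_views": 5},
    {"id": "s3", "page_views": 1},
    {"id": "s4", "page_views": 3},
]
rate = bounce_rate(sessions)  # 2 of 4 sessions bounced
```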

 

Activation

When users have logged in and started using the product, the usage needs to be tracked in order to further develop the product for a better customer experience.

  • Page views
  • Time spent on the site
  • Hourly traffic
  • Seasonal traffic
  • Monthly Active Users
  • Number of paying customers in the last Month/Quarter
  • Number of paying customers YoY growth
  • Type of payments
  • Types of Merchants (small/SME/seasonal)
  • Types of businesses/industry
  • Type of most sold items
  • Customer Segmentation (Technical, Demographics, Behavioral) to understand customer’s need to use the product to improve product development

 

Retention

Retention is the process of retaining existing customers by continued service leading to customer satisfaction. Measuring the factors that lead to retaining customers is a good indicator.

  • Number of returning customers
  • Average time for transaction
  • Number of transactions
  • Transaction failure rate
  • Number of transactions per payment type
  • Peak hour
  • Peak Season
  • Types of Merchants
  • Average revenue per Merchant
  • Average Revenue per Merchant per branch/Industry type
  • Average time taken for deposit to merchants
  • Competitor Analysis through web/Facebook crawling
  • Facebook engagement (Likes, Shares, Comments) per month/week
  • Number of Complaints per category of complaint type
  • App Store Ratings/Review trends
  • Text Analysis for tweets/ Facebook comments
  • Number of cash payments vs card payments
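
Returning-customer metrics like those above are often read per signup cohort. A rough sketch of a cohort retention table (the user IDs and months are invented):

```python
def retention(first_seen, active_by_month):
    """Monthly retention per signup cohort: the share of each cohort
    still active in a given month."""
    cohorts = {}
    for user, cohort in first_seen.items():
        cohorts.setdefault(cohort, set()).add(user)
    table = {}
    for cohort, users in cohorts.items():
        table[cohort] = {
            month: len(users & active) / len(users)
            for month, active in active_by_month.items()
            if month >= cohort   # only months at or after the cohort start
        }
    return table

first_seen = {"u1": "2021-01", "u2": "2021-01", "u3": "2021-02"}
active_by_month = {
    "2021-01": {"u1", "u2"},
    "2021-02": {"u1", "u3"},
}
table = retention(first_seen, active_by_month)
```

Reading the table row by row gives exactly the time-series comparison the Cohorts bullet earlier describes: the January cohort retains half its users into February, while the February cohort is brand new.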

 

Referrals

When the customer satisfaction index is high, customers refer the products to others, thereby acting as brand ambassadors. Referrals are a means to measure customer satisfaction, because customers refer the product only when they are happy with it themselves.

  • Number of visits coming from social media
  • Number of site entry from Facebook ads
  • Number of shares on Facebook
  • Text analysis of tweets and Facebook messages

Revenue

One of the most important parts of a business is revenue generation, as revenue is not only the sustenance factor but an indicator of growth.

  • Total Payment Volume
  • Total Net Revenues
  • Transaction losses
  • Net revenue YoY growth
  • Net revenue YoY growth per type of business
  • Net Revenue per type of card (Master/Visa)
  • Sales turnover of customers
  • Number of transactions per Month/Quarter
  • Number of transactions per type of business
  • Number of transactions per Location
  • Net revenue per platform (mobile app (iOS/Android/iPad), card reader, NFC)
  • Net revenue per type of merchant
  • Average revenue per client
  • Average value per transaction
  • Peak volume of transactions per hour
  • Peak volume of transactions per hour per location per type of business (to be able to suggest to similar merchants about the optimum time and hour of transaction)
  • % churn
  • % churn per type of merchant/type of business/Month/Quarter
  • Average Selling price per type of Merchant per type of business
  • Average Selling price per type of Merchant per type of business trends – Monthly/Quarterly/Seasonal
  • Number of customers that have applied for a loan
  • Type of customers (business/demographics) that have applied for a loan via Company X
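
The % churn metrics above reduce to a simple set computation over the customer base at the start and end of a period; a rough sketch (the merchant IDs are invented):

```python
def churn_rate(customers_start, customers_end):
    """% churn over a period: customers present at the start who are
    gone by the end, as a share of the starting base."""
    if not customers_start:
        return 0.0
    lost = customers_start - customers_end
    return 100.0 * len(lost) / len(customers_start)

q1 = {"m1", "m2", "m3", "m4"}
q2 = {"m1", "m3", "m5"}          # m2 and m4 left; m5 is a new merchant
rate = churn_rate(q1, q2)        # 2 of 4 starting merchants churned
```

Slicing the same computation by merchant type or business gives the per-segment churn KPIs listed above.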

 

Conclusion

Product usage tracking to improve overall product features and outreach is an iterative process involving continuous A/B testing, UX strategy, analytics, ideation and product development. In order to create state-of-the-art products, Company X needs to know who its audience is and how the product will make it easy for businesses to sell. By tracking product usage, the aim should be to learn deeply about customers’ needs and behaviours so as to generate great solutions proactively. Iterating towards the solution that creates the most value, by collecting and analysing data, is the key.

Free Wi-Fi a boon for retailers?


With the number of smartphone users on the rise every minute and consumers having more choices than ever, businesses have got to get innovative in order to attract new customers and retain existing ones. On a recent trip abroad, where I had my data roaming turned off, I realised the importance of retailers offering free Wi-Fi! This got me thinking about the ways in which free Wi-Fi could boost sales and increase customer engagement.

  • Free Wi-Fi sure drives traffic! Consumers throng to retail outlets offering Wi-Fi and spend more time in stores, which could lead to conversions. On the contrary, offering no Wi-Fi could drive traffic away.
  • Having access to the internet is a way to quickly check products on offer in the store, find online discount coupons that can be redeemed in store, try out products in the store but order similar products (in variations) online – thereby reducing bounce rate – and compare prices online. All of this leads to an overall better consumer experience and boosts customer retention.
  • Consumers act as brand ambassadors on social media, liking, sharing and checking in at the retail outlets. The number of check-ins at a particular store speaks to its popularity; the same applies to consumers sharing and complimenting products on offer in the stores on Facebook, Twitter and Instagram. Consumer referrals are a great way of attracting more traffic to both the online and the physical stores.

The crux, however, lies in easy and quick connectivity. If retailers boast about free Wi-Fi but have a cumbersome process for connecting to the hotspot, this could actually backfire.

A great mobile reception and easy connectivity to Wi-Fi – happy customers & better sales!

Programmatic Conversion

Programmatic marketing uses data-driven insights to convert prospects into customers. There is more than meets the eye in conversion rate optimisation. Some of the deciding factors for conversion are UX design, the landing page, the source of web traffic, content, competitive pricing of products, goodwill, social media marketing, effective campaigns and customer engagement. Programmatic marketing entails analysing data at every customer touch point and targeting the consumer with compelling, preferably personalised, offers. Conversion does not necessarily mean making a customer shell out money; it could be interpreted as winning customer loyalty by means of signing up for a newsletter, downloading whitepapers or trial versions of the product, or spending considerable time on the site. This loyalty, in the long run, could result in big wins through persuasion in the form of emails, SMSs, direct contact and targeted recommendations.

Channeling data about prospects – online behaviour, previous purchases, socio-economic segmentation, online searches, products saved in the online basket – in other words, getting to know the customer well enough to suggest products that make a meaningful difference in their lives, results in higher conversion rates. It is here that digital convergence is of paramount importance: it blends online and offline consumer tracking data across multiple channels to produce targeted campaigns. Offline tracking through beacon technology is catching on. It is a win-win solution for both the retailer and the consumer, providing each with useful information: a consumer with an enabled smartphone app within a certain distance of the beacon receives useful, targeted information about products and campaigns, while the retailer gathers data about consumer shopping habits.
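The blending step of digital convergence can be sketched in a few lines: events from separate channels (web, beacon) are folded into one per-customer timeline that downstream targeting can work from. The record layout and field names below are illustrative assumptions, not any specific vendor's schema.

```python
from collections import defaultdict

# Hypothetical touchpoint records; field names are illustrative assumptions.
online_events = [
    {"customer_id": "c1", "channel": "web", "action": "viewed", "sku": "A"},
    {"customer_id": "c1", "channel": "web", "action": "basket", "sku": "B"},
]
offline_events = [
    {"customer_id": "c1", "channel": "beacon", "action": "dwell", "zone": "shoes"},
]

def unify_touchpoints(*event_streams):
    """Blend events from multiple channels into one per-customer timeline."""
    profiles = defaultdict(list)
    for stream in event_streams:
        for event in stream:
            profiles[event["customer_id"]].append(event)
    return dict(profiles)

profiles = unify_touchpoints(online_events, offline_events)
print(len(profiles["c1"]))  # 3 touchpoints for customer c1
```

In practice the merge key is rarely this clean – identity resolution across devices and channels is its own problem – but the principle of a single unified customer timeline is the same.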

The online experience can be enhanced to reduce the bounce rate by incorporating some of the following design thoughts:

  1. Associative content targeting: Web content is modified based on the visitor’s search criteria, demographic information and source of traffic – the more you know about the prospect, the better you can target.
  2. Predictive targeting: Using predictive analytics and machine learning, recommendations are pushed to consumers based on their previous purchase history, the segment they belong to and their search criteria.
  3. Consumer-directed targeting: The consumer is presented with sales, promotions, reviews and ratings prior to purchase.

Programmatic marketing offers the ability to constantly compare and optimize ROI and profitability across multiple marketing channels. Data about consumer behaviour, both offline and online, cookie data and segmentation data are algorithmically analyzed to re-evaluate the impact of all media strategies on the performance of consumer segments. Analyzing consumer insights and testing in iterations, for instance with A/B tests, contributes to a higher conversion rate. Using data-driven methods to achieve a higher conversion rate is programmatic conversion, and it is here to stay.
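The A/B testing mentioned above boils down to asking whether a variant's conversion lift is larger than chance. A common way to check is a two-proportion z-test; the sketch below implements it with the standard library, with made-up traffic numbers.

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-score comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: 2.0% vs 2.6% conversion on 10,000 visitors each.
z = ab_z_score(200, 10_000, 260, 10_000)
print(z > 1.96)  # True -> B's lift is significant at the 5% level (two-sided)
```

A z-score above 1.96 corresponds to p < 0.05 for a two-sided test; programmatic platforms run this kind of comparison continuously across channels and segments.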

Intelligence Of Things


IoT – the Internet of Things – is the science of an interconnected everyday life: devices communicating over Wi-Fi, cellular, ZigBee, Bluetooth and other wireless and wired protocols, RFID (radio frequency identification), sensors and smartphones. Data monetization has led to revenue being generated by gathering and analyzing customer data, industrial data, web logs from traditional IT systems, online streams, mobile devices and sensors – and the interconnection of them all, in other words, IoT. IoT is hailed as the new way to transform education, retail, customer care, logistics, supply chain and healthcare. IoT and data monetization have a domino effect on each other, generating actionable insights for business metrics, transformation and further innovation.

Wearable devices are a great way to keep tabs on patient heart rates, step counts, and calories consumed and burnt. The data gathered from such devices is not only useful for checking vital signs but can also be used to scrutinize the effectiveness of drug trials and to analyze why the body reacts to different stimuli the way it does. In logistics, IoT reads bar codes at every touch point to track the delivery of products; comparing estimated with actual delivery times and analyzing the reasons for the difference can help businesses build better processes. In smart buildings, HVAC (heating, ventilation, air conditioning), electric meter and security alarm data are integrated and analyzed to monitor building security, improve operational efficiency, reduce energy consumption and improve occupant experience.
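The logistics example – comparing estimated with actual delivery times from bar-code scans – can be sketched as a small stdlib script. The shipment records and timestamp format below are illustrative assumptions.

```python
from datetime import datetime

# Hypothetical scan records from bar-code reads along the delivery chain.
shipments = [
    {"id": "s1", "estimated": "2024-05-01T10:00", "actual": "2024-05-01T12:30"},
    {"id": "s2", "estimated": "2024-05-01T10:00", "actual": "2024-05-01T09:45"},
]

def delay_minutes(shipment):
    """Minutes between estimated and actual delivery; positive means late."""
    fmt = "%Y-%m-%dT%H:%M"
    est = datetime.strptime(shipment["estimated"], fmt)
    act = datetime.strptime(shipment["actual"], fmt)
    return (act - est).total_seconds() / 60

late = [s["id"] for s in shipments if delay_minutes(s) > 0]
print(late)  # ['s1'] – shipment s1 arrived 150 minutes late
```

From here, grouping delays by route, carrier or warehouse is what surfaces the process bottlenecks the paragraph above alludes to.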

IoT is expected to generate large amounts of data from varied sources at high volume and very high velocity, increasing the need to better index, store and process such data. Earlier, the data gathered from each source was analyzed in a central hub and communicated to other devices, but IoT brings a new dimension: M2M (machine-to-machine) communication. The highlights of such M2M platforms are:

  • Improved device connectivity
  • API, JSON and RDF/XML integration for data exchange
  • Flexibility to capture data in all formats
  • Data scalability
  • Data security across multiple protocols
  • Real-time data management on-premise, cloud or hybrid platforms
  • Low TCO (total cost of ownership)
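The "capture data in all formats" point usually means normalising vendor-specific payloads onto one common schema before analysis. The sketch below does this for two invented JSON payloads; the field names (`dev`, `device_id`, `t_c`, `kwh`) are illustrative assumptions, not any real device's schema.

```python
import json

# Illustrative raw payloads; field names vary by device vendor (assumption).
raw_payloads = [
    '{"dev": "thermo-1", "t_c": 21.5}',
    '{"device_id": "meter-7", "kwh": 3.2}',
]

def normalise(payload):
    """Map vendor-specific JSON fields onto one common record schema."""
    record = json.loads(payload)
    device = record.get("dev") or record.get("device_id")
    reading = record.get("t_c", record.get("kwh"))
    return {"device": device, "value": reading}

unified = [normalise(p) for p in raw_payloads]
print(unified[0]["device"])  # thermo-1
```

A real platform would drive this mapping from declarative schema definitions rather than hard-coded field lookups, but the normalisation step itself is the same.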

The data flow for an end-to-end IoT use case entails capturing sensor-based data (for example, querying RDF-encoded data with SPARQL) from different devices and wearables into a common data platform, where it is standardised, processed, analyzed and communicated onwards – as dashboards, as insights, as input to another device, or as fuel for continuous business growth and transformation. Splunk, Amazon and Axeda are some of the M2M platform vendors that provide end-to-end connectivity of multiple devices, data security, and real-time data storage and mining. Data security, including adherence to data retention policies, is another important aspect of IoT. As IoT evolves, so will the interconnectivity of machine-to-machine platforms – exciting times ahead!