Blockchain technology is at the center of next-generation operating models.
By: Mark Green
Data governance is dead. Its principles are more important than ever, but the operating models today’s enterprises use to apply them have become a barrier to innovation.
New applications and system integrations are beginning to produce increasingly unmanageable volumes of data. At the same time, data governance policies are becoming stricter. This environment puts a great deal of pressure on organizations to find streamlined, automated solutions for data governance.
When IT teams spend time, resources, and money managing data, they have less to spend on new, strategic initiatives that help customers and end users. Organizations that adopt a disruptive, automated approach to governing their data will be prepared for tomorrow’s regulatory landscape.
In the world of data governance, there are ongoing debates about ownership, security accountability, and the proper use of data assets. Today’s tools and systems are not robust enough to handle oncoming challenges, but next-generation blockchain technology has exactly the features tomorrow’s organizations need.
Rethinking Data Governance
Data governance has organic origins. At the beginning of the Computer Era, established organizations and institutions put ad-hoc governance policies in place to quantify and mitigate risks. As technology advanced, these risks compounded into the security landscape we’re familiar with today.
Governments saw these developments occurring and sought to protect citizens and consumers with legislation such as the Sarbanes–Oxley Act, HIPAA, and the GDPR.
Each of these initiatives generates a marked increase in sophistication and complexity for the organizations subject to it. The latest advances in data governance policy put a great deal of stress on organizations to protect user data.
Right now, GDPR is only in effect in Europe, but California has passed its own legislation to the same effect. It is only a matter of time before every jurisdiction has its own set of robust data governance policies.
The now-mature practice of implementing expensive, time-consuming changes to organizational data governance is ripe for disruption. Organizations need a robust, agile data governance system that registers data and algorithms securely, audits them, and generates unchangeable audit logs automatically.
Steepening Privacy and Security Challenges
The debate on personal data is ongoing. High-profile data leaks, cyberattacks, and privacy failures have forced organizations to come to terms with a more complex data landscape than ever before. There are plenty of examples to choose from.
Facebook’s Cambridge Analytica scandal offers a poignant example. Cambridge Analytica accessed the usage data of 87 million US residents and used that data to influence the US presidential election. The organization was able to identify people through their activity – it didn’t need names, addresses, or other unique IDs to do so.
This opened up a nationwide discussion on the ethics and ownership of personally identifiable information as well as its governance. The systems and tools that organizations use to capture this data are evolving too quickly for organizations to keep up on their own.
Not only must organizations come to terms with the ongoing debate on data ownership, but they must also account for their security practices. In Target’s case, cybercriminals accessed the company’s point-of-sale devices by hacking its heating and ventilation contractor.
Target found itself accountable for the security failures of a third party. Today’s executives running large-scale enterprises with high digital exposure have to ask themselves how many third parties they are accountable for.
Even after organizations align ownership and security accountability, there are still new technologies on the horizon that will serve to disrupt the governance landscape. Big data, cloud computing, and Internet of Things technologies all come with unpredictable privacy and security concerns.
Blockchain Offers Data Governance Agility
Organizations can’t rely on the traditional practice of deploying expensive, time-consuming governance frameworks with implementation schedules that stretch into multiple years. By the time the new system is fully integrated, the data governance environment will have changed so much that the solution in place will no longer be useful.
Lack of agility is already a well-known death knell for large organizations. Just as newspapers, magazines, and even MySpace were unable to analyze their data for user insights to keep up with a changing media landscape, so too will healthcare and financial companies lose out on integration opportunities as the data governance landscape advances.
In order to govern the increasing volumes of data while handling continuously changing analytics, organizations must implement unbreakable, automated methods for registering and controlling the use of data assets. Without a solution of this kind in place, organizations will remain perpetually out of sync with broadening data governance regulations.
Blockchain technology offers organizations the ability to disrupt the data governance landscape by automating data asset compliance. Blockchain uses a peer-to-peer security framework where each individual node on the network plays a role in auditing and verifying transactions that take place on that network automatically.
While previously limited to applications in cryptocurrency, blockchain technology has a far wider reach in terms of potential applicability. It allows users to build trustworthy platforms on agile, lightweight frameworks without the need for a third-party authority.
In the world of data governance, a blockchain platform could register all data assets and the algorithms used to analyze them in a secure, unchangeable form. It could then categorize each algorithm by use case and continuously verify the inputs and outputs to generate audit logs that track data governance metrics as they are captured.
Blockchain’s peer-to-peer network verification capabilities would ensure that data assets, algorithms, and their respective analytics are effectively tamper-proof. Placing this data in a single encrypted repository simplifies the control and tracking of data governance solutions.
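The tamper-evident property described above comes from hash chaining: each log entry commits to the hash of the entry before it, so any retroactive edit invalidates every later hash. The sketch below is a minimal, hypothetical illustration of that idea in Python (class and field names are invented for the example; this is not Shortest Track’s implementation, and a real blockchain would add peer replication and consensus on top).

```python
import hashlib
import json


class AuditLog:
    """Hash-chained audit log: every entry commits to the previous entry's
    hash, so editing any past record breaks verification of the chain."""

    def __init__(self):
        self.entries = []

    def record(self, asset_id, action, payload):
        """Append an entry for a governance event, e.g. registering a
        data asset or running an algorithm against it."""
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "asset_id": asset_id,
            "action": action,        # e.g. "register", "analyze"
            "payload": payload,      # inputs/outputs being audited
            "prev_hash": prev_hash,
        }
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append({**body, "hash": digest})
        return digest

    def verify(self):
        """Recompute every hash; any tampering surfaces as a mismatch."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if entry["prev_hash"] != prev or recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

In a peer-to-peer deployment, each node would hold a copy of this chain and re-run `verify()` independently, which is what makes the log effectively tamper-proof rather than merely tamper-evident.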
Introducing Shortest Track’s Merchandise Mart
Shortest Track is currently developing a Merchandise Mart to house and run these data and governance algorithms in a secure, simplified platform. The Merchandise Mart uses artificial intelligence to determine which sequence of algorithms and data solves users’ business problems most effectively and authenticates usage with a blockchain ledger.
The Merchandise Mart allows organizations to automatically register all of their data and algorithms in a single repository. This makes tracking, analyzing, and updating them far easier, but the truly disruptive element takes this solution one step further: The Merchandise Mart allows business teams to access pre-registered analytics, linked to intelligence systems that organically respond to changing business conditions. Under this framework, innovative business agility becomes a service.
Organizations that use Shortest Track’s Merchandise Mart for data governance will be able to replace expensive, time-consuming data governance processes with streamlined solutions that are pre-registered, automated, and ready for immediate integration. This offers organizations best-in-class data agility without losing control of rapidly increasing data volume.
By: Mark Green
Next-generation data governance requires a next-generation operating model with new capabilities.
Data Governance is becoming a barrier to innovation. The central principles of Data Governance are to control what data gets used, how it gets used, and how it gets secured. This systemic control costs money. Data storage, handling, and processing need to be decided on, documented, and locked down. Once that is done, any change must pass through formal review processes, adding further cost in money and time. Implementing the change then costs still more money and time while introducing change management risks as well.
As new varieties, volumes, and velocities of data escalate exponentially, pressure for new analytics and insights continues to increase disruption risks from market share to whole business platforms for companies that do not evolve fast enough. Companies need to increase their rate of innovation to stay competitive.
Intelligence automation with blockchain at its center provides the opportunity to manage data governance at the pace of the Digital Age. Done smartly, intelligence automation with blockchain can reduce costs, risks, and organizational friction while enabling companies to move much faster.
Data Governance is about registering data and every interaction with it, then securing both so that all deviations are pre-approved and audited. The current state catalogs these assets statically and layers on a manual change management process to pre-validate and control all change. Even in the most sophisticated companies, data governance tools tend to be stand-alone, not integrated tightly with the day-to-day use of data. Management of analytics is mostly in place but remains a business process without the ability to actively manage use. The edge cases are departments supplementing systemic analytics with spreadsheets, where analytics are not locked down at all.
Current State Remains Rooted in the Past
Companies started data governance organically as a solution to quality risks. More recently, governments stepped in with reporting requirements like the Sarbanes–Oxley Act, Basel I and II, and HIPAA to address accountability and protection risks. The convergence of these initiatives led to today’s data governance practices and an associated industry of software solutions, consultancies, institutions, and academics.
This mature practice is now ripe for disruption. The emergence of big data from an epidemic of monitoring devices is producing a flood of uncontrolled data, forcing companies to rethink Data Governance as well as the governance of the analytics and models that increasingly run key components of our businesses.
The sharp debate today revolves around personal data. Europe is taking the lead by stipulating ownership and consent requirements with its General Data Protection Regulation (GDPR), and much closer to home, California is following. This comes after years of hacks; random highlights include Target, Equifax, and the US government. There have been fumbles too, where systems were not hacked but sloppy governance led to massive misuse. Facebook provides a robust example. Its story starts with ignorance: the company appeared unaware of how clients’ data was being used on its platform. Its CEO initially said “it is ridiculous” that anyone could think fake activity on Facebook could influence the US presidential election. Later, the company disclosed that Cambridge Analytica had accessed the personal data of 87 million US residents. It is still not clear what Facebook knew and when, as its narrative keeps changing; Facebook maintained for much of this time that there was no data breach. The active debate points remain 1) who owns the data and 2) what does security mean?
Even after alignment on ownership and security accountability, big data, cloud repositories, and interconnected activities are still evolving too fast to control. And the pressure to act without governance will continue mounting as data volumes, their changing interconnectedness, and the associated analytics increase.
McKinsey maps the level of change that data has driven, industry by industry, over the past three years.
Change starts where information analysis matters most. But as sensors from the Internet of Things (IoT) become pervasive, disruption will spread.
Three statistics tell the story. Most companies currently analyze only 12% of the data they have. 90% of the world’s data has been created in the last two years alone. IoT is projected to save consumers and businesses $1 trillion a year by 2022.
To govern with increasing volumes of data and continuously changing analytics, the whole of data governance must be automated. A new, unbreakable, automated way of registering and controlling the use of data and analytic assets is needed.
The digitalization of media is the easy place to find examples of these disruptions. The analytics in media relate to matching content with audience wants. The gravestones include many newspapers and magazines. Even digital companies like MySpace and Yahoo! got displaced by Facebook and Google for not moving fast enough. Amazon expands into industries on the back of analyzing and responding to more data faster than its competitors.
Healthcare and financial services are ripe for change as well. Companies with strong data governance tools and processes in place are the innovation laggards. CDOs that focus on stewarding data instead of innovating through data and analytics will struggle. Strong central control often leads to small teams doing things on their own; the resulting lack of integration produces inconsistent execution, costly manual spot checks of business analytics, confusion, and increased risk. As the development and deployment of artificial intelligence and machine learning accelerate, the governance and risk management challenge compounds under static catalogs and manual processes.
Into the Future
The next-generation operational model has to register all data and algorithms systematically in an unchangeable form. The algorithms need to be registered by use case and oriented toward solutions, so the inputs and outputs of every process used to provide a solution can be automatically audited as they are used. The model then needs to automatically keep a log of assets used for sequential tracking, including reports and who views them. And this model and log have to be unbreakable, auditable, and transparent.
If all enterprise data and algorithms are automatically registered in a repository before use, the challenge of controlling and tracking use over time becomes far simpler. This repository provides both CDOs and business teams with a common array of pre-registered analytics covering all data and analytic assets. If these assets are linked in a taxonomy that ties them to intelligence solutions on a runnable ecosystem, analytics become easy to select, use, and track. If requests for new solutions are allowed, the ecosystem becomes organic and grows to fit changing business conditions. Finally, if solutions to these requests are sourced in the open market, the ecosystem becomes an incubator of intelligence innovation too. This breaks the iron grip of today’s data governance while closing the gap between the strategic need for governed data and the way data and analytics actually get executed by the line-of-business teams that survive in changing times.
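The register-then-track model above can be pictured with a short sketch: assets are registered once, business teams select them by use-case taxonomy, and every access is logged. This is a hypothetical illustration (the `Asset` and `Registry` names and fields are invented for the example, and a production system would back the log with the unbreakable ledger described earlier).

```python
from dataclasses import dataclass


@dataclass
class Asset:
    asset_id: str
    kind: str          # "data" or "algorithm"
    use_cases: list    # taxonomy tags, e.g. ["churn", "pricing"]


class Registry:
    """Register-once repository with use-case taxonomy and usage tracking."""

    def __init__(self):
        self._assets = {}
        self.usage_log = []   # sequential record of every access

    def register(self, asset):
        # Registration is append-only: assets are never overwritten.
        if asset.asset_id in self._assets:
            raise ValueError("asset already registered; entries are immutable")
        self._assets[asset.asset_id] = asset

    def find(self, use_case):
        """Business teams select pre-registered assets by taxonomy tag."""
        return [a for a in self._assets.values() if use_case in a.use_cases]

    def use(self, asset_id, user):
        """Every use is tracked, giving the audit trail its raw material."""
        self.usage_log.append((asset_id, user))
        return self._assets[asset_id]
```

The design point is that governance stops being a separate manual process: because selection and execution go through the registry, control and tracking fall out of ordinary use.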
The Shortest Track Company is working with clients today to provide this “Intelligence-as-a-Service” ecosystem with its Merchandising Mart, where it handles registration, taxonomy, execution, requests for solutions, open market request sourcing, and usage tracking. Shortest Track puts blockchain in the middle to register, authenticate and log usage in an unbreakable ledger with an audit trail.
Re-platforming traditional data governance methods with blockchain becomes simple when the whole company has migrated to an automated registry and tracking model. Adding intelligence sourcing, selecting, and operationalization to the Mart makes use and innovation easy too.
Full automation and tracking of data and analytics allows companies to be nimble with data without losing control of how it is used.
Shortest Track is an Intelligence-as-a-Service solution company that manages and accelerates the intelligence supply chain by sourcing, selecting, operationalizing, and syndicating solutions. It is also at the forefront of Blockchain to re-platform data governance with a fully automated and scalable framework for securing data, processes, analytics, and solutions.
Complex, risky technology integrations will soon be a thing of the past.
By: Mark Green
Risk is at the heart of every capital enterprise. Someone who risks their resources investing in a potential business is ultimately accountable for the success or demise of that business.
During the industrial revolution, capital enterprise typically meant building huge factories, outfitting them with machinery, and then hiring as many low-skilled operators as possible.
In today’s business environment, things hinge less on physical labor and more on data. Today’s entrepreneurs and business leaders have to decide between a dizzying number of enterprise resources that make modern business possible.
Many of the organizations behind these resources have become powerful names in the industry: Salesforce, Oracle, Adobe, and SAP are just a few. These systems form the foundations that every large-scale organization in the world relies on for almost all of its routine tasks.
But these purpose-built technological solutions come with their own set of risks. Executives typically go to great lengths to identify and mitigate these risks using processes that may soon be made obsolete by artificial intelligence.
The Risks Of Enterprise Technology Integration
One of the major challenges that executives face is choosing the specific combination of technologies their organizations will use to meet business needs. They have to weigh the advantages and drawbacks of various third-party providers and their own in-house capabilities to arrive at a streamlined, integrated system for doing business.
These integrations tend to be incredibly complex. Not only must a broad variety of purpose-built systems do their jobs correctly; they must also connect with one another in a streamlined way.
At the same time, these technologies need to work together across a wide variety of business tasks like marketing, production, supply chain, point of sale, procurement, and finance. Often, they must interact with external ecosystems to do this.
Enterprise technology integration is so complex that most executives simply outsource it to major firms. Alternatively, they may outsource the business process itself to a service vendor with a ready-made integration.
These are major decisions that enterprises typically cannot go back on – once you’re committed to a technological ecosystem, you have to make it work. The problem with this model is that it forces executives to make risky long-term bets on almost every aspect of their infrastructure, and every new process integration weighs down the system, becoming an intrinsic barrier to innovations in process and intelligence usage.
Enterprise technology implementations come with significant short- and long-term costs. Deploying a comprehensive in-house solution reduces operating costs but requires massive up-front investment. Business-Process-as-a-Service solutions require no up-front investment, but end up costing similar amounts over time and include significant switching costs to move between options.
The more complex a technology ecosystem is, the harder it is to maintain interoperability within it. Interoperability failures lead to workarounds, which hurt efficiency and make the corresponding infrastructure redundant at best and obstructive at worst.
Changing processes and ownerships within an organization always carries a cultural impact. There is no guarantee that employees, managers, and directors will accommodate new processes or technologies the way enterprise technology consultants say they will, and there are plenty of examples of times they did not.
Cultural rejection is a serious investment risk. Integration strategies that fail to accommodate company culture negatively impact morale throughout the entire organization, from entry level to the executive suite.
Enterprise integrations are never future-proof. At best, they are reasonably future-resistant, but even the best projections can never predict how the organization should respond several years down the line. Building and integrating new solutions can take years, and that process doesn’t even start until the company is certain that the future is changing.
Many organizations learn about this risk the hard way as growth makes their integrations increasingly brittle and unmanageable. Point-to-point (P2P) integrations offer a great example of how this works.
How AI Solves Integration’s Biggest Problems
The main problem with enterprise technology integration is that it is a static solution to a dynamic problem. Executives look for technology solutions to address particular business challenges, integrate them, and then hope for the best as the business system responds to these changes with a new set of challenges.
The world’s top enterprise executives recognize this and establish processes for dynamically responding to technology needs. They collect data continuously and optimize business solutions within and across data stacks, constantly testing for better, more efficient solutions. This could be called the Bezos approach, but it’s not limited to Amazon.
Artificial intelligence makes it possible for executives running organizations of nearly any size to apply the same concept. AI technology is ideally suited to the task of continuously collecting data, optimizing its algorithms, and continuously testing its results to come up with better solutions.
With an artificially intelligent integration resource on-hand, enterprise executives no longer need to think about organizing and committing to any one technology. Instead, the organization simply subscribes to specific solutions for their business challenges.
Behind the scenes, this Intelligence-as-a-Service solution would optimize within and across data stacks to find the most efficient solution for every problem at hand. The transformative part of the process is that artificial intelligence can do this autonomously, without any need for additional input.
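The collect-optimize-test loop described above can be sketched as a simple explore/exploit selector over candidate solutions: mostly run the best-known solution, occasionally test an alternative, and update estimates from observed results. The epsilon-greedy strategy below is one standard way to do this; it is an illustrative stand-in, not the actual optimization machinery of any particular platform, and the `reward` feedback would come from live business metrics.

```python
import random


class SolutionSelector:
    """Epsilon-greedy selection over candidate solutions: exploit the
    current best most of the time, explore alternatives occasionally."""

    def __init__(self, solutions, epsilon=0.1):
        self.epsilon = epsilon
        # Running count and mean observed reward per solution.
        self.stats = {s: {"n": 0, "mean": 0.0} for s in solutions}

    def choose(self):
        if random.random() < self.epsilon:
            # Explore: test a random alternative.
            return random.choice(list(self.stats))
        # Exploit: run the solution with the best observed mean reward.
        return max(self.stats, key=lambda s: self.stats[s]["mean"])

    def report(self, solution, reward):
        """Feed back an observed result (e.g. conversion rate, cost saved)."""
        st = self.stats[solution]
        st["n"] += 1
        st["mean"] += (reward - st["mean"]) / st["n"]  # incremental average
```

Run continuously, a loop like this never commits permanently to one solution: as conditions change and a rival solution starts outperforming, the estimates shift and the selector follows.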
The key to making this solution work is establishing a centralized marketplace for data, algorithms, and workflow solutions for artificially intelligent integrators to browse through. This is where Chameleon Collective and Shortest Track come into the picture.
Introducing the Merchandising Mart
Chameleon Collective is helping Shortest Track launch the Merchandising Mart, a back-end ecosystem with an AI front-end for addressing business challenges. The Merchandising Mart takes in data from suppliers and algorithms from developers; executives then use artificially intelligent solutions to source, assemble, manage, and operationalize the best solution for their needs. Under the hood, the algorithms are validated and benchmarked.
This performance grading of analytics at the granular level drives continuous source improvements while enabling AI assembly and optimization to automatically maintain best solutions. This becomes powerful as the assembly and optimization process includes transfer learning to leverage algorithms across all business verticals, so improvements in retail analytics immediately impact the quality of consumer product analytics.
This levels the playing field between enterprises employing data scientists – who can directly leverage the backend to optimize data within and across stacks – and those who simply need solutions directly from an integrated front-end interface. This platform makes it possible for executives in all industries to continuously monitor and update their solutions for change, performance, and efficiency. This includes leveraging practices across industries to innovate. Shortest Track has an AI front-end to explain, train, and sustain solutions to make the intelligence actionable to executives. This next-generation operating model enables companies to move and adjust fast.
For executives, subscribing to the Intelligence-as-a-Service from the Merchandising Mart ecosystem represents the best way to find answers to old and new business questions. It makes it easy to identify opportunities to increase productivity over time by automatically reviewing processes according to new contributor components.
This approach offers excellent results when applied to specific problems, while also providing a platform for addressing complex business environments. As executives adopt this next-generation operating model across multiple departments, it will provide several additional benefits.
Shortest Track’s AI-powered Intelligence-as-a-Service front-end with its AI-managed Merchandising Mart ecosystem back-end enables companies to implement a next-generation operating model today at their own speed, starting with an entry point and then migrating across their company.
By: Mark Green
A long time ago, the advertising business was simple. Advertisers hired an agency to create and place advertising and gave them 15% of their adspend as compensation.
Today, every aspect of agency compensation is negotiated. This procurement squeeze led agencies to specialize into business areas to drive profits. In media investment, they went further and built differentiated profit centers in data, analytics, and targeting. Startups provided innovations. Then the agencies and other agency-less competitors duly bought the startups to scale their capabilities. The specialization started in digital and migrated to television. According to Magna Global, 60% of programmatic digital ad spend is captured by these profit centers (25% for Data Targeting and Verification, 15% for Trading Desk, 10% for DSP, 5% for Exchange, and 5% for Agency of Record.) The percentage of television adspend captured by these profit centers is more opaque.
Not surprisingly, digital publishers wanted a piece of this pie too. Google and Facebook duly kept the data and targeting in-house, forcing advertisers to accept the publisher’s self-verification on audiences.
Advertisers traditionally leaned on agencies to explain and guide them into new methods and services, but that is now changing. The Association of National Advertisers (ANA) exposed agencies’ conflicts of interest in a report on media transparency, and the FBI is now investigating whether fraud occurred. Not surprisingly, advertisers are becoming more involved and taking their own advice as they realize that agencies are no longer just their agents but resellers too.
Outwardly, advertisers call for transparency. Inwardly, advertisers explore ways to transform their marketing. Many ideas are being considered: going brand direct, owning their customer data, managing it directly through customer data platforms (CDP), taking media placement in-house, and implementing iterative marketing by A/B testing everything. They grapple with several strategic questions, including: How do they organize their data and tech stacks? Most importantly, how do they get sufficient transparency to manage them?
There are three dimensions to transparency.
On method and process audits
Heeding the call for transparency, the Media Ratings Council (MRC) offers to audit various media measurement and analytic practices to ensure that they do what they claim and meet the MRC’s minimum standards. Most companies support the idea of minimum standards and practice audits, even though many forgo the audits for practical reasons.
Practice audits certify fixed methods and processes. On the practical side, this limits audits to mature practices that are no longer evolving. Companies focused on continuous innovation and process improvement would have to freeze their methods and processes, halting that innovation, to qualify. Consequently, the MRC can only certify mature ecosystem practices.
The other rub is that the MRC is trying to go beyond quality standards and delve into imposing metric standards. Granted, there is more prestige to be had here, but metrics are a dangerous realm for the MRC to enter. Getting alignment on metrics before market forces have spoken is messy and bad for the industry. Competing interests, and digital versus television in the case of cross-media measurement, spark fights over definitions. If resolution is found, the winning voices will come from the bigger players. Worse, defining metrics as standards puts the MRC on a collision course with innovation.
Instead of going down the rabbit hole on designating and governing metrics, the MRC should stick to its knitting of auditing consistency and quality. Are the methods and processes of companies sufficiently documented and executed? Do these methods and processes accurately do what they claim?
Leave the metric debates to industry associations, confabs, and most importantly, market forces.
Allow reputable third party consulting practices to audit the methods and processes of companies that are evolving current or building new capabilities. This would provide faster, more economical, and independent perspectives on evolving capabilities. This is a new business opportunity since industry organizations and agencies are not independent.
On performance audits
While methods and process audits inform advertisers on the quality of practices, they do not tell advertisers anything about the performance of their campaigns.
In most cases, advertisers rely on the service companies who siphon off 60% of the money before it gets to the publisher to tell them how their campaigns performed. In many cases, advertisers do not have direct granular visibility to performance either. This is hidden behind walled gardens, analytic models, and deal terms. In many cases, the advertiser has no independent confirmation that their alleged performance is real and accurate. Examples of opacity are everywhere. Facebook restates performance. YouTube places advertisements next to terrorist videos. Attribution companies are actively reworking their models after advertisers told them that their numbers did not add up. K2 reports that some media agents are not being transparent about their deals either.
The companies that service the buying and selling ecosystem have too many interconnected business interests, and too many secrets and proprietary methods of their own, to be independent. What advertisers need are consultancies that do not contract with vendors to audit, evaluate, and report claimed performance. These auditors must be independent of, and non-competitive with, every vendor they audit, because they will need to sign non-disclosure agreements with each of them. Vendors will only cooperate if the consultant’s business model does not compete with theirs and gives it no interest in divulging their secrets.
On adspend audits
In addition to performance, advertisers want an accounting of how their marketing spend is spent.
Just as a CFO would never run a business without a Controller, regardless of certified practices, a CMO should never run campaigns without an independent auditor inspecting how the money is spent. To govern the ever-evolving complexity of marketing, the time has come for advertisers to retake control of how their money is spent.
This is the easiest of the audits. There is no judgement on the quality of methods or evaluation of performance; it is a simple matter of bringing transparency to the money trail: who received what money, relative to what the advertiser spent.
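A money-trail audit of this kind reduces to reconciling what the advertiser paid against what each party reports receiving. The sketch below illustrates the arithmetic using the programmatic take-rate breakdown cited earlier (the figures are hypothetical dollar amounts constructed from those percentages, not real campaign data).

```python
def reconcile(advertiser_spend, receipts):
    """receipts maps each party in the supply chain to the amount it
    reports receiving; the return value is the unaccounted-for gap."""
    accounted = sum(receipts.values())
    return advertiser_spend - accounted


# Hypothetical $100,000 campaign split along the cited take rates:
# 60% captured by intermediaries, 40% reaching the publisher.
receipts = {
    "data_targeting_verification": 25_000,  # 25%
    "trading_desk": 15_000,                 # 15%
    "dsp": 10_000,                          # 10%
    "exchange": 5_000,                      # 5%
    "agency_of_record": 5_000,              # 5%
    "publisher": 40_000,                    # 40%
}

gap = reconcile(100_000, receipts)  # a zero gap means the trail is complete
```

Any nonzero gap flags money that no party accounts for, which is exactly the question an independent adspend auditor exists to answer.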
Advertisers can clean up the transparency problem themselves by hiring these three types of auditors, on methods and processes, on performance, and on adspend. While consultancies may offer all three services, it is imperative that advertisers hire firms that specialize in auditing and do not sell the services being audited. Conflicts of interest are a slippery slope.
The strategic choice for television publishers has arrived. It is time to choose.
By: Mark Green
The strategic choice for television is to either compete with digital media or become digital. On the consumer side, digital delivers video on any device at any time. On the business side, digital provides addressable targeting, real-time buying, and response tracking of outcomes.
Traditional television still retains some advantages over digital. Traditional television provides fast reach with complete videos on full screens without fraud.
If television publishers view TV as Total Television instead of just linear, they can win. As this viewpoint argues, television publishers with direct-to-consumer distribution have the winning hand if they move fast.
Everyone knows that change is here.
As with any innovation, digital does not need parity with television to win. It simply needs to get enough right to gain traction, and then incrementally improve its features to compete. This may include solving for television’s advantages or simply ignoring them. The marketplace will judge which features matter with ad dollars as features improve.
Over the past few years, digital has been tackling accumulating reach for video ads, reporting screen size delivered, reporting consecutive video seconds delivered, and accounting for fraud. Digital is now working on accounting and filtering for context. Enabling brand safety is one aspect of this. Synchronizing ad messages with content for impact is another. Digital is a young and dynamic medium that is evolving fast, adding new features every year.
How to transform?
Current television publishers have three major revenue sources: ad dollars, program licensing, and retransmission fees. Their business model is business-to-business, selling bulk audiences. Now that digital has given consumers the power to pick and choose, the value of bulk communications is decreasing while the value of individual communications is increasing.
Television publishers need to pivot their business model to wean themselves off of the declining parts of the business and invest in the areas of growth. The simplest strategic adjustment is to change revenue and profit calculations from monetizing programs to monetizing individuals. This will drive decisions to maximize investments in the growth opportunities of digital and minimize investments in the declining businesses of bulk audiences.
Analogous direct-to-consumer businesses point to opportunities. Netflix keeps it simple: revenue and profit per subscriber. Ad dollars, co-op programs, and merchandising for programs with followings could be additional per-subscriber revenue streams for television publishers. This can apply to all content, including movies and games. For companies like Disney, the subscriber framework can extend into stores and theme parks. Disney invented this construct in the 1950s with the Mickey Mouse Club, and then went on to extend its bulk audience products (movies) into theme parks and merchandise. So a hard direct-to-consumer pivot can be expected to happen at Disney first. The question for the other television publishers is: can they make the pivot too? And after transforming their US businesses, will television publishers have the vision to go beyond the US by monetizing international subscribers? Digital companies think globally. After getting organized in the US, Netflix is now going international fast.
Leapfrogging is required.
Television publishers are moving too slowly.
The move to addressable is slow because the cable and satellite companies control the majority of the traditional infrastructure and have not viewed upgrading as critical to their business. Cable and satellite companies upgrade to deliver video-on-demand services where profitable. However, since not all homes are profitable, this build-out has gone slowly over the past decade. As of last September, 50 million out of 126 million US homes (40%) had addressable infrastructure. Addressable ads have been an afterthought.
Meanwhile, 89% of American adults use the internet. The opportunity to communicate directly is wide open. Inventions to fill this void are pouring in. They cover infrastructure (Smart TVs, Roku / Apple TV-type boxes, and enabler apps) and content (Netflix, Hulu, Amazon Prime, and ESPN+). Native digital content like YouTube, Twitter video, and now Facebook video are also circling with appetites for subscriber ad dollars. All these inventions are accelerating the movement from linear to digital television.
Adults 65 and older, with the lowest internet usage at 68%, have the highest concentration of linear television viewing. The next-lowest age group is adults 50 to 64, at 88% internet usage. These two statistics show that even lagging adopters are moving, suggesting change will keep accelerating.
Once television publishers start selling and communicating directly with their consumers, they can solve derivative issues like ad loads with individualized algorithms to optimize consumer revenues, balancing the value of ads with subscriptions and making this a consumer choice.
Expect measurements to change too.
As digital video solves for viewability, with second-by-second tracking of ad exposure and with screen size and fraud accounted for, it will increasingly create pressure for a new second-by-second opportunity-to-see metric for all video, including television.
This introduces challenges for traditional measurement schemes. Reporting individual seconds will require larger measurement panels: while averages work from smaller samples, individual seconds do not. Since the panel sizes necessary for individual seconds will not be affordable even when limiting the reports to demographics, hybrid techniques will be required: census tuning panels calibrated with persons-viewing panels to predict who is tuning. However, this solution may soon become less about delivery and more about share benchmarking if television publishers move to the ATSC 3.0 standard and orient all their delivery to a subscriber-based system. That pivot would give television publishers second-by-second return-path data on all viewing from all their subscribers. Measurement opportunities would then move to third-party audit systems and second-by-second tracking of brand mentions, brand images, and context, in both programs and ads, for effective attribution.
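The sampling argument above, that averages work from smaller panels while individual seconds do not, can be made concrete with a standard margin-of-error calculation. The panel sizes and the 1% tuning level below are illustrative assumptions, not figures from the text:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a tuning proportion p estimated from a panel of n homes."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical numbers: a 1.0 household rating means roughly 1% of homes
# (p = 0.01) are tuned in at a given moment.
p = 0.01

# Averaged over a telecast, a national-scale panel keeps the estimate tight.
national_panel = 40_000          # assumed panel size
moe_total = margin_of_error(p, national_panel)

# A single second within a single demographic breakout may rest on far fewer homes.
demo_cell = 1_000                # assumed homes in one demo cell
moe_cell = margin_of_error(p, demo_cell)

print(f"relative error, full panel: {moe_total / p:.0%}")
print(f"relative error, demo cell:  {moe_cell / p:.0%}")
```

Under these assumptions the full panel estimates the 1% tuning level within about ten percent of its value, but a thousand-home demographic cell carries a relative error above sixty percent, which is why second-by-second demographic reporting pushes panel costs beyond what is affordable and makes census return-path data attractive.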
At some point the question might also be asked: are consumers actually looking at the television every second? This applies to content and ads. The growing pervasiveness of cameras in devices suggests that this could be solved with calibrating panels as well, applied to the television publishers' subscriber return-path data. Given sensitivities to privacy, it is unlikely that all consumers will allow themselves to be tracked by camera in the near future, hence the need for that calibrating panel.
There are two narratives for television publishers.
By: Mark Green
The heart of innovation is testing new ideas. Most often, it involves blending data, techniques, or processes to create new products. Occasionally, it means creating products from scratch.
Blending is the low risk scenario. It expands the value from current costs. It has a higher success rate than building products from scratch. How many deliberate - not accidental - inventions do you know of that do not blend prior ideas?
Yet many companies struggle with this. Large companies usually have areas of responsibility with no easy way to bridge them. Those who try to innovate encounter prioritization resistance in addition to fear of cannibalization.
Startups tend to be more flexible and experimental, and yet they struggle in this area too. Often their product is new to the marketplace, and scaling involves finding repeatable business development opportunities. These opportunities require fitting the new into the old (syncing data, tech, and processes) to demonstrate value. Typically, startups go through multiple rounds of these development opportunities before discovering and anchoring to what scales. This is the classic prioritization struggle of early-stage startups. In reality, it is the same problem that any growth-stalled company faces: the innovation conundrum. The only difference is that bigger companies have expense decisions to make as well.
The simple way to solve this problem is to have an independent team solely focused on business development.
This solution is where many companies get it wrong. They often think business development means a separate salesperson with some product knowledge fishing for opportunities. Or, just as bad, CEOs take this role themselves and disrupt the roadmap with every pilot opportunity.
A more effective way is to start with the right team, a team that can fully develop engagement opportunities. The team needs to identify opportunities, develop pilot concepts to engage with, set up the meetings, build the MVP (minimum viable product) for the pilot engagement, and manage the product and business sides of the engagement without involving anyone else in the company. Call it “sales with skunk works.” This is what business development needs to be in order to solve the innovation conundrum.
The reason many smaller companies shy away from this approach is that it is simply too expensive to retain the talent necessary to succeed. Larger companies stumble on ownership politics and culture.
Successful teams require experienced multi-disciplined experts with imagination, flexibility, and a willingness to get their hands dirty. The ideal team includes both sellers and builders led by an architect with deep knowledge and experience across current business data, tech and processes.
One simple way to solve the innovation conundrum is to hire this packaged team as a service from a company like martechpartner.com. You get ready experts who have deep experience in all aspects of marketing data, technology, and processes. You can hire their service in pieces or as a fully formed team.
By: Mark Green
Here are the topics of 2018.
We see some 2017 trends accelerating in 2018.
Surprisingly quiet are:
Still struggling towards an answer is the dominant question: