Switching to the carpool lane can be intimidating, especially when you don’t have all the information you need to make the lane change safely. For example, is the carpool lane open? Is my blind spot clear? How do you efficiently collect bits and pieces of data at the right time to make the best-informed decision when switching lanes?
Messages and events from cloud, IoT, and mobile devices are generated in real time and at tremendous rates. Each of these event sources generates small pieces of data that accumulate into large volumes.
As events are increasingly produced and consumed outside the corporate data center and collected by IoT gateways, it makes sense to process events efficiently at the edge of the network. This concept, known as “edge computing,” enables connectivity directly to devices or an IoT gateway via various protocols and allows computations to be performed on streaming data, such as parsing, filtering, and aggregation, delivering the data upstream to the corporate data center or sending control signals back downstream to the device by way of alerts, rules, or triggers.
In any real-time analytics journey, stream processors play an important role by facilitating two-way streams between the device (or the IoT gateway) and the data center, and they have the ability to perform edge analytics. Collecting from event sources is a key feature of stream processors, as it establishes data points about the source system and can describe its behavior, but it doesn’t end with stream data collection. Stream processors must apply basic transformations such as parsing or aggregation to data in flight and guarantee delivery, not only back to the device but to various targets such as Kafka, Hadoop, or complex event processing systems.
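The parse/filter/aggregate pattern described above can be sketched in a few lines. This is an illustrative Python example, not Informatica code; the event shape (`device_id`, `reading`) and the threshold filter are hypothetical:

```python
import json

def process_edge_events(raw_events, threshold):
    """Parse raw JSON events, filter out low readings, and aggregate per device."""
    totals = {}
    for raw in raw_events:
        event = json.loads(raw)                       # parse
        if event["reading"] < threshold:              # filter
            continue
        device = event["device_id"]
        totals[device] = totals.get(device, 0) + event["reading"]  # aggregate
    return totals

events = [
    '{"device_id": "sensor-1", "reading": 5}',
    '{"device_id": "sensor-1", "reading": 42}',
    '{"device_id": "sensor-2", "reading": 17}',
]
print(process_edge_events(events, threshold=10))
# {'sensor-1': 42, 'sensor-2': 17}
```

A real edge pipeline would do the same three steps continuously on an unbounded stream rather than over a finished list.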
Going back to the carpool lane example: if the vehicle is equipped with a collision detection system, an alert is triggered when a driver is about to change lanes and the vehicle is too close to another vehicle or object. This is a great example of edge computing, as event data is immediately processed and sent back as an alert to the driver.
As you switch to the carpool lane, consider Informatica’s VDS stream processor, a distributed, scalable system that collects all forms of streaming data at high rates, accumulating large data volumes that can be analyzed and acted on while they are still fresh and relevant. The key to success in delivering a real-time analytics solution is the ability to derive value from events as they happen; a quicker reaction time makes it possible to affect the outcome of an event before it completes.
When assessing a stream processor, look for a brokerless pub-sub messaging system that sources data from a variety of systems, scales to process millions of records per second, eliminates single points of failure, and doesn’t require intermediate storage or multiple hops to guarantee record delivery.
As your journey continues in the fast data lane, in the next blog we will look at how a stream transport system such as Apache Kafka integrates into real-time data pipelines, providing a mechanism to move data in transit.
Source: Informatica Perspectives
Dylan Jones answers the three biggest questions he has come across when helping with data migrations and gives tips on the best way to ensure migration success.
At Informatica, we listen to our customers’ goals and then guide them towards solutions that help accelerate their digital transformation journey whilst delivering value to their business. We pride ourselves on helping customers out-innovate, out-maneuver, and out-perform. We want the same for our partners – we want you to be Out-Standing. So, to help you make your mark and showcase your credentials, we have evolved parts of our Partner Program for Consulting & Systems Integrator Partners.
Customer Success is a business imperative for us. Helping customers successfully solve pressing and often complex business challenges by leveraging incredible technology ensures we remain relevant to their mission and continue to be a strategic part of their future. It is often our Partner ecosystem that applies our technology and, through the delivery of services and solutions, helps customers realize their objectives.
To showcase Partner excellence, we have set new objectives within our partner program that Partners need to meet or exceed. Partners must meet specific criteria to remain active members of the Informatica Partner Community and to be eligible for benefits and promotion within the program. These new requirements include:
- Maintaining certified experts and implementation skills in the products you intend to implement;
- Registering completed or in-progress projects as a demonstration of capability;
- Obtaining Customer endorsement and satisfaction of service capability.
Meeting these requirements enables Informatica to confidently position, promote and present Partners to customers and our Field Sales Team when they are looking for Partners to work with.
The Informatica Partner Program for Consulting & Systems Integrator Partners provides a wide range of resources, training and tools to enable the best possible experience for customers:
- Collaborative Sales Teaming with the Informatica Sales Force
- Free and heavily discounted training and enablement resources
- Financial Referral Fee Benefits
- Access to Global Customer Support (GCS)
- Sales and marketing resources
- Ability to be a Recommended Partner Advisor to Informatica Customers and the Informatica Sales Team. This is based on Informatica expertise, experience and customer success ratings.
What’s the benefit for me as a partner, you might ask? It is our intent to pass leads to demonstrably capable Partners; to advocate more for our demonstrably capable Partners; and to help create differentiation for our demonstrably capable Partners.
To be Out-Standing, you have the control to position your capabilities to the worldwide Informatica sales teams and our thousands of customers and prospects. Use this simple yet very effective mechanism via our Partner Portal, now a requirement for Consulting & Systems Integrator Partners, to ensure you are out-performing your competition. Simply navigate to the ‘Sales & Projects’, ‘Register a Project’ section of the portal to begin your journey to being Out-Standing.
For more information, please contact us at Partners@informatica.com – we’d be happy to help.
Looking forward to seeing you take advantage of this great program!
The post Be Out-Standing at Customer Success appeared first on The Informatica Blog – Perspectives for the Data Ready Enterprise.
Source: Informatica Perspectives
This year’s Informatica World was truly eye opening. Thanks to the amazing participation and feedback from our partners, our teams walked away with incredible learnings about their growing successes and opportunities. And as underscored by Rodney Foreman in a recent blog post, Data Security stood out as one of the hottest topics.
Having received many requests from partners wanting more education on our data security offerings, I decided to take a deep dive into this top-of-mind and relevant discussion. Using the questions raised by Rodney, I highlight below how partners can solve these challenges for clients, as well as the Informatica resources available to support them.
Visibility to Data Across the Enterprise
My customer lacks visibility into what data needs protection: That is, what’s sensitive, where it is, who has access to it, and where is it going?
A common issue with sensitive data is that it originates from a variety of data sources and is spread across many places in the enterprise, resulting in poor tracking and governance. Businesses are unable to effectively identify, measure, and prioritize sensitive data risks. As companies grow, the problem compounds; data is cobbled together using inefficient Band-Aid tactics, and data security threats worsen.
In fact, Gartner analysts agree and state that, “data security governance and the orchestration of data security policies across disparate data silos and platforms will be critical challenges for organizations during the next decade.”
Take, for example, a large financial services firm we work with. Audits showed they lacked appropriate visibility into Personally Identifiable Information (PII) located in unexpected places, such as social security numbers in CRM note fields and/or phone number fields, making this type of data a major challenge to track. With Informatica solutions in place, this information could be discovered and tracked, alleviating concerns about visibility, risk, and access to sensitive and personal information.
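To illustrate the kind of discovery described here (not how any Informatica product works internally), a naive scan for SSN-shaped values hiding in free-text CRM fields might look like this in Python; the record fields and values are invented:

```python
import re

# Flags values that look like US Social Security numbers (NNN-NN-NNNN).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_ssn_fields(record):
    """Return the names of fields whose text contains an SSN-shaped value."""
    return [field for field, value in record.items()
            if isinstance(value, str) and SSN_PATTERN.search(value)]

crm_record = {
    "name": "Jane Doe",
    "phone": "078-05-1120",  # an SSN mistakenly stored in a phone field
    "notes": "Customer verified via SSN 219-09-9999 on last call.",
}
print(find_ssn_fields(crm_record))
# ['phone', 'notes']
```

Production discovery tools go far beyond a single regex (checksums, context, machine learning), but the sketch shows why PII in note fields is findable at all.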
To help firms like this implement effective risk management, we work with Partners, leveraging a variety of Informatica solutions. Secure@Source, for example, identifies where sensitive data resides and how it is consumed, providing visibility into its location, its risks, and its proliferation, to deliver complete visibility to enterprise CIOs, CSOs, CFOs, and anyone responsible for data security.
Data Breach Mitigation
When a data breach occurs, it typically takes hundreds of days to detect it; even longer if it’s an insider threat. How do we speed that up?
Anomaly detection, which identifies high risk usage of sensitive data, is not only key to mitigating data breaches, but is critical for quickly recognizing when a problem occurs. With machine learning and AI solutions that detect abnormal behaviors, as well as high risk users and data stores, full-blown crises can be diverted!
For example, imagine an engineering intern’s daily usage increases from 1 MB to 100 MB, signaling an issue. An anomaly like this is rarely identified, let alone communicated to management, without intelligent detection and alerts in place. In this case, we would again recommend Secure@Source from the Informatica portfolio. Through its visibility into risk and anomalies, 360-degree view of sensitive data, and array of dashboards, reports, and alerts, it can speed up breach identification and orchestrate a response, putting businesses at ease.
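A toy version of the intern example: flag a day’s usage that dwarfs the historical baseline. This fixed-multiplier threshold is only a sketch; real anomaly detection uses machine learning rather than a hard-coded factor:

```python
def is_anomalous(history, today, factor=10.0):
    """Flag today's usage if it exceeds `factor` times the historical mean."""
    baseline = sum(history) / len(history)
    return today > factor * baseline

daily_mb = [1.0, 1.2, 0.9, 1.1, 1.0]  # an intern's typical daily usage in MB
print(is_anomalous(daily_mb, today=100.0))  # True: 100 MB >> 10x the ~1 MB baseline
print(is_anomalous(daily_mb, today=1.3))    # False: within normal variation
```

Even this crude check shows the core idea: an alert fires only when behavior departs sharply from an established per-user baseline.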
My customer doesn’t know where to start remediation once a breach has occurred. And, what’s the right method of protection?
First, it is critical to have a strategy in place to proactively prevent data risks. Furthermore, your customers need the ability to resolve a data breach if and when it occurs. Policy-based detection and automated remediation are vital and must be applied to data wherever it exists: in development, test, production, and legacy applications; at rest and in motion; and in cloud and big data environments.
For example, battle-proofing sensitive data with de-identification and de-sensitization products reduces the chance of compromised security. Informatica’s dynamic and persistent masking solutions are one way we orchestrate data security measures for limiting and preventing access to sensitive data. What’s more, they block, audit, and alert on staff who access this information, ensuring compliance with security policies and with industry and civil privacy regulations.
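To make the dynamic/persistent distinction concrete: dynamic masking hides values at read time, while persistent masking replaces them with repeatable surrogates (e.g., for test environments). A minimal Python sketch, not Informatica’s implementation; the salt and surrogate format are invented:

```python
import hashlib

def mask_dynamic(ssn):
    """Dynamic masking: hide all but the last four digits at read time."""
    return "***-**-" + ssn[-4:]

def mask_persistent(ssn, salt="demo-salt"):
    """Persistent masking: replace the value with a repeatable surrogate."""
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    return "XXX-" + digest[:8]

print(mask_dynamic("219-09-9999"))  # ***-**-9999
# The same input always yields the same surrogate, so joins still work:
assert mask_persistent("219-09-9999") == mask_persistent("219-09-9999")
```

The repeatability of the persistent variant is what keeps masked test data referentially consistent across tables.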
With the above remediation strategies in place, it’s likely your customers will be able to identify and correct problems before they result in unfavorable situations.
How You Can Drive Solutions for Your Customers
Every day we learn more about the needs of our partners who are driving data security strategies across enterprises, large or small– and we’ve built enablement tools to ensure you’re successful. We encourage you to leverage the sales tools and marketing programs.
Source: Informatica Perspectives
Designing Your Cloud Roadmap: Reference Architectures and Architectural Planning
My last post took a closer look at the IT roadmap. For a cloud migration, a transformation around data analytics, or any long-term IT revolution, you create a roadmap to guide you from your present state to the desired future vision. But that begs a question: How do you know what that future vision should look like?
Say you’re mapping a move to the cloud from a mostly on-premises legacy. What does the best version of “going to the cloud” look like, specifically? And how do you make sure that everything on this cloud roadmap works with your larger IT infrastructure?
You can find an answer to the first question with reference architectures, and the second with architectural planning. Let’s look at both.
Leveraging reference architectures
You have to translate your abstracted, executive-level vision and strategy into concrete goals and plans. Reference architectures provide thoroughly conceived examples of what a desired end state will look like. They’re generic, never a perfect fit for your industry, your business situation, or your needs, but they provide a great starting point, which you’ll customize to overcome your specific challenges.
Finding and selecting a reference architecture can be tricky. Industry groups or IT associations may provide them, though they’re not always up-to-date. Analyst firms sometimes have them, though usually at too high a level for roadmapping. Vendors are often the best source, in that vendors have the incentive to create and regularly update their reference architectures. But they can be the worst source, in that they tend to actually be product architectures disguised as reference architectures, and they only frame challenges around the solution the vendor sells.
As you consult with vendors, avoid inherent lock-in. I recommend starting by identifying the capabilities you need. Then identify solutions that can deliver that functionality. You’ll have a better conversation if you lead with, “Tell me how you can help me achieve my goals,” rather than “Show me what you think I should buy.”
Determine which vendor(s) can deliver the most comprehensive set of necessary capabilities. You can then weigh advantages of going best-of-breed vs. all-in with one solution provider.
Overall architectural issues
While the reference architecture helps you figure out the components of a given transformation journey, the greater architectural consideration is how the new systems and processes will work within your overall IT infrastructure.
You want to understand how new technologies will fit not only with your old ones, but with your plans to maintain, upgrade, or replace those older systems. Crucially, look at the data. The ability for businesses to leverage their data across the entire organization, combining it in new and unpredictable ways to arrive at new business insights, is quickly evolving from cutting edge to basic requirement. Make sure that new technologies won’t create future silos that keep data locked up, its potential that much harder to tap.
Informatica’s Professional Services team has designed product-agnostic reference architectures as consulting tools, rather than product sales tools. We aim to help customers solve their problems, not ours, in areas like cloud, data management, and analytics. We believe in open architectures, from which you can then find the right specific solutions.
Issues such as security, regulatory compliance, business continuity, and disaster recovery should be covered in your roadmap, but you also need a holistic view of how these concerns are addressed across your entire infrastructure.
“Holistic” has become an IT buzzword lately, and here’s another: “synergy.” Looking at the enterprise IT architecture as a whole, are you achieving maximum benefit for minimum investment (or at least the best possible balance)?
Other organizational values/concerns for an enterprise architecture might include centering on certain IT standards, emphasizing values such as “cloud first” or “cloud only,” working in a modular/reusable way, or supporting methodologies such as Agile development or DevOps.
Transforming your perspective
Roadmapping is fun. Technologists love the puzzle of combining possibilities to create great infrastructures and great outcomes. Selling the rest of the business on these plans, however, can be more difficult. Up next, we’ll look at how the journey to cloud requires a new look at the value of data, because what your data is worth should determine what your company invests in managing and monetizing it.
For a deeper dive into the roadmapping process, contact us about setting up one of our roadmapping workshops, which offers a hands-on look at how to evolve your organization to meet specific business and IT goals.
Source: Informatica Perspectives
Paul Malyon describes what Data Portability will mean with the introduction of GDPR, how it benefits customers and how you can prepare your data for it.
Getting data-driven digital transformation projects into production quickly requires a modern approach to data integration. Data is increasingly spread across a proliferating number of systems, including multiple clouds and big data platforms as well as existing applications and databases. Many companies struggle with digital transformation because they don’t have an organized and agile data backbone to connect all of these systems together efficiently and with data consistency. They don’t have an easy way for distributed teams to access data through self-service tools. This is where Data Integration Hub is a game changer.
Data Integration Hub 10.2 continues the evolution of Informatica’s hub for governed data integration and sharing. Building on the modern publish/subscribe hybrid architecture of Data Integration Hub, the 10.2 release adds enhancements to self-service and visibility as well as orchestration and processing efficiency.
The self-service wizard enhancements in Data Integration Hub 10.2 enable less technical users to do more themselves without requiring an IT developer. The new Hub Overview provides a visual representation of the relationships between publishers, topics and subscribers as well as lineage to provide the visibility needed to easily govern data integration. The new orchestration capabilities make it easier to automate multi-step publication and subscription processes for complex use cases.
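The publisher/topic/subscriber relationships that the Hub Overview visualizes can be illustrated with a toy in-memory pub/sub hub. This Python sketch has none of Data Integration Hub’s persistence, governance, or orchestration; it only shows the pattern:

```python
from collections import defaultdict

class Hub:
    """Toy publish/subscribe hub: publishers push to topics, subscribers receive."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of this topic.
        for callback in self.subscribers[topic]:
            callback(message)

hub = Hub()
received = []
hub.subscribe("orders", received.append)   # a downstream consumer
hub.publish("orders", {"order_id": 17, "total": 42.5})
print(received)
# [{'order_id': 17, 'total': 42.5}]
```

Decoupling producers from consumers through named topics is what lets new subscribers be added without touching the publishers.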
To learn more about Data Integration Hub 10.2 and see a live product demo, join the Informatica expert team in a deep dive and demo technical webinar on Tuesday, June 20th at 8:00AM PT.
Source: Informatica Perspectives
It’s a sunny afternoon, and you’re on the highway with the top down, stuck in traffic. You fumble through your navigation trying to figure out the best and fastest route to your destination. You can take the side streets, where you will encounter traffic lights and eventually get to your destination. Alternatively, you can switch to the carpool lane, provided you have met the requirements: if there are two or more passengers and the carpool lane is operational (typically during peak driving hours), then you can drive in the fast lane. Now you have a decision to make: do you rely on static data (what you can see in front of you) to continue your journey, take the side streets for a slower trip, or switch to the fast lane when it’s safe to do so?
Similarly, organizations are looking to manage growing, varied, and fast-moving data. The constant generation of data from cloud, IoT devices, and social applications presents an enormous opportunity for data-driven organizations to drive greater profitability, accelerate product and service innovation, and deliver exceptional customer experiences.
Traditional (batch) data processing frameworks such as MapReduce provided a “rear-view mirror” into the company’s past, supporting distinct types of analysis on historical data: descriptive, diagnostic, predictive, and prescriptive. Analysts can query relevant historical data for a specific period (hours, days, weeks, months, or years) to understand past patterns. But data at rest loses value over time, so time-critical business decisions are often made using stale data.
As organizations transform to become data-driven, a modern approach to managing continuous data is required to act on real-time data. To do so, organizations must address the defining requirements of fast data, velocity and variety, in order to implement a real-time analytics solution.
Real-time analytics has two essential components. First, stream processors continuously collect and parse data from event sources such as cloud, IoT devices, and social applications as events occur, and deliver it to a streaming transport system. Second, streaming analytics solutions consume data from streaming transport systems over a temporary time-based window that allows for data manipulation, enrichment, refinement, and analysis, ultimately delivering it for a variety of uses such as alerting, real-time visualization, or persisting to Hadoop for historical analysis.
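The time-based window mentioned above can be illustrated with a tumbling (fixed-size, non-overlapping) window that counts events per window. A minimal Python sketch with invented event tuples, standing in for what a streaming engine does continuously:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, payload) events into fixed windows; count events per window."""
    counts = defaultdict(int)
    for timestamp, _payload in events:
        # Snap each event to the start of its window.
        window_start = timestamp - (timestamp % window_seconds)
        counts[window_start] += 1
    return dict(counts)

events = [(0, "a"), (3, "b"), (7, "c"), (12, "d")]  # (seconds, payload)
print(tumbling_window_counts(events, window_seconds=5))
# {0: 2, 5: 1, 10: 1}
```

Real engines evaluate such windows incrementally as events arrive and also offer sliding and session variants, but the grouping logic is the same.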
Riding the fast data lane may not be for all organizations; however, as data generation from IoT devices and data requirements increase, understanding and implementing strategies for managing continuously generated data should be a key imperative for organizations undergoing digital transformation.
To shift to the fast lane, look into Informatica’s Streaming Solution, which allows organizations to prepare and process data in streams, collecting and transforming data from a variety of sources and scaling to billions of events with a processing latency of less than a second. Informatica’s Intelligent Streaming leverages prebuilt transforms that run natively on Spark Streaming and uses Apache Kafka as the data transport across mappings, with data replay for recoverability.
In the next blog, we will discuss how you can design for the fast data lane, starting with stream processors (or data collectors), and how Informatica’s Streaming Solutions can get you into the fast data lane.
The post Fast & Furious – Shifting from The Slow Data Lane appeared first on The Informatica Blog – Perspectives for the Data Ready Enterprise.
Source: Informatica Perspectives
Rebecca Hennessy looks at how, with consumers becoming more sophisticated in their appreciation of data, there is a greater need to achieve a single customer view.
Digital transformation is what everybody wants, but there are many views on what constitutes a “digital transformation.” Is it online services? Is it a social enterprise? Is it legacy transformation? Is it an analytics-driven enterprise?
The definition of a digital enterprise may be all of the above, or just one of them. What really matters in the end, however, is one thing and one thing only: whether the digital effort delights customers and encourages them to keep doing business with you.
It takes data, delivered at the right time, in the right way, and in the right format, to realize this goal, as explored in a recent report from Forbes Insights. “It is by pulling together high-quality data on customers from multiple sources, and capturing insights from advanced analytics and software tools, that customer engagement can be transformed across digital touchpoints,” the report states, adding: “The goals of digital transformation of customer engagement— the hyper-personalization, relevancy, real-time feedback and on-the-fly agility— are not attainable without access to relevant data available at the right time.”
The need is urgent, as it is estimated that demand for digital-related services will account for more than 70% of all external services growth, and within the year, revenue growth from data-based products will be double that of the rest of the product or service portfolios for one-third of all Fortune 500 companies. Many organizations are having difficulties joining this movement, however, as they may be encumbered by “the sheer volume of data in organizations that might have decades-long histories, as well as the dispersal of that data—in multiple CRMs, on spreadsheets, in filing cabinets— much of it conflicting, incomplete, inaccurate or otherwise untrustworthy.”
The Forbes report outlines three strategies that need to be the cornerstone of any digital transformation strategy:
Build a single view of your customer: A “single view of the customer” has been the Holy Grail of data, systems, and application integration initiatives for many years now, seen in such initiatives as data warehousing and master data management. These efforts now need to be augmented by the wealth of external data available. Enterprises need to start by “inventorying and cataloging data, whether it is in a spreadsheet or a CRM system, and assessing it according to its completeness, accuracy and trustworthiness, as well as where it needs to be enriched by external data sets.” Relevant data may include purchase histories, communication preferences, purchasing power, shopping behavior, and social media activities. This requires synthesizing “structured data (addresses, household composition, socio-economic bracket, etc.) with unstructured data (like the free-form text of a Twitter feed). Doing so requires systems and processes that can handle both kinds of data, pulling together historical information along with the real-time data being generated by customers’ real-world activities.”
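The single-view merge can be sketched as folding per-source records into one profile per customer. This toy Python example keys on email and is only illustrative (the source names and fields are invented; real matching relies on fuzzy identity resolution, not exact keys):

```python
def build_single_view(sources):
    """Merge per-source customer records, keyed by email, into one profile each."""
    profiles = {}
    for source_name, records in sources.items():
        for record in records:
            profile = profiles.setdefault(record["email"], {"sources": []})
            profile["sources"].append(source_name)
            for key, value in record.items():
                profile.setdefault(key, value)  # first source to supply a field wins
    return profiles

sources = {
    "crm":     [{"email": "jane@example.com", "name": "Jane Doe"}],
    "billing": [{"email": "jane@example.com", "lifetime_value": 1200}],
}
view = build_single_view(sources)
print(view["jane@example.com"]["name"], view["jane@example.com"]["lifetime_value"])
# Jane Doe 1200
```

Even this sketch surfaces the key design questions: which key identifies a customer, and which source wins when fields conflict.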
Use location data to add precision and context: “In a mobile world, it’s no longer enough to know who your customers are; you also need to know where they are in order to deliver a real hyper-personalized and responsive experience,” the report advises. Customer location needs to be captured in two contexts: their physical location and their digital presence, such as social media usage. “Location helps to provide an important context by which to align data. For example, if a business experiences a sudden spike in returns of a faulty product, they may understand there is a problem but not what’s causing it. However, if they can see that the returns are predominantly confined to a geographical region, that may reveal a problem relating to temperature or humidity— two elements that may be affecting their product in a specific way.”
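The faulty-product example reduces to grouping returns by region and looking for a concentration. A minimal Python sketch with invented data:

```python
from collections import Counter

def returns_by_region(returns):
    """Count product returns per region to surface a geographic concentration."""
    return Counter(region for region, _reason in returns)

returns = [
    ("gulf-coast", "faulty"), ("gulf-coast", "faulty"),
    ("gulf-coast", "faulty"), ("midwest", "faulty"),
]
print(returns_by_region(returns).most_common(1))
# [('gulf-coast', 3)]
```

A dominant region in the counts is exactly the signal that points an analyst toward an environmental cause like heat or humidity.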
Create relevant communications at the right time on the right channel: As data reveals important details about customers, it’s important to then be able to engage them through multiple points of conversation. This requires “the use of data to inform personalization preferences around channel and device, to understand response rates at different times or days of the week, and how to devise an optimal channel mix for your audience.” It’s important to note that “sending information via a new channel requires more than simply digitizing it,” the report cautions. “Think about how unwieldy it is to view a PDF copy of a phone bill on a mobile screen that requires pinching and zooming to navigate it. Digital transformation requires a rethinking of the process, in a way that best serves the customer in the channel they are using. In the PDF example, that might mean prioritizing the important information— how much is owed, what the usage and due dates are, perhaps— and making interaction, such as paying the bill, frictionless.”
The post It Takes Data to Delight Your Customers appeared first on The Informatica Blog – Perspectives for the Data Ready Enterprise.
Source: Informatica Perspectives
–By Anish Jariwala and Laura Wang
Informatica has won one of this year’s Stackie Awards presented by Scott Brinker at the MarTech San Francisco 2017 Conference.
We love the Stackies. They’re the only award to recognize the hard work that goes into assembling and integrating a marketing tech stack– and the cool challenge of capturing that stack in a single slide.
We also love them because they raised $7,800 for Girls Who Code, a wonderful charity that I hope you’ll support. (We need more women coders!)
This year, there were 57 entries — all pretty awesome — so we’re thrilled to be one of only six winners. The stack visualizations were judged on these simple criteria:
- Alignment: how well-aligned is your stack with your business
- Concept: how insightful is the conceptual organization of your stack
- Clarity: how easy is it for a reader to understand your stack
- Design: the aesthetics of your slide and its visual appeal
- Detail: more detail is generally better, within reason for a single slide
Here’s our winner:
The story our stack tells
Our entry is a snapshot of the way our customers’ and prospects’ data moves through our tech stack, centered around our marketing data lake. (Franz and Anish wrote all about the data lake in a book called The Marketing Data Lake.)
The judges liked our stack diagram because, as they said:
“It shows terrific transparency for how Informatica engages with its prospects and customers, in addition to being a great example of a highly synchronized marketing stack.”
To summarize, we use our marketing data lake to integrate data from our CRM (Salesforce), marketing automation (Marketo), and marketing analytics (Adobe) into one central store.
We then visualize the data through Tableau and have started building high-value use cases, like an account-based view of our universe. Since it’s a data lake instead of a rigid warehouse, we can keep adding data sources to improve our insights and the performance of our predictive analytics (Lattice).
The marketing data lake has completely transformed our marketing. And it’s only a start.
What’s next for the lake
The next step is to create an enterprise data lake that extends beyond marketing.
We’re feeding it with data sources from across the company, such as product usage data, financial data, and customer support data. The architecture is set up so that new data can be fed in from any source very quickly and data sets can be easily shared with all users.
The enterprise data lake will give us marketers the ability to add even more insights into our current analytics—driving better understanding of product adoption, cross-sell and up-sell. Because we can now address and leverage data generated from anywhere, not just from a marketing system, our teams can apply predictive modeling and artificial intelligence to this wider set of data to target the right message to the right person at the right time.
And as we add more data sources, the data lake delivers value to other departments too: product, customer support, and finance teams will be able to dive in and generate insights about how customers are engaging with us. A few examples:
- Product teams: will be able to see which features customers use, split by, let’s say, how they came to us. So we might learn that people who come in from a certain webinar are far more likely to use a given feature.
- The finance team: will be able to correlate product behavior with profitability.
- Customer support: will be able to see whether users of a certain product or feature are more or less likely to need help.
And that’s just scratching the surface.
The bottom line
We’re hugely excited about the potential of the enterprise data lake. And we love that we’re ‘eating our own dog food’ and showing how every company’s data carries enormous strategic value.
And, of course, we’re honored and humbled to have won a coveted Stackie Award for 2017!
Thank you to Scott Brinker (the Mick Jagger of Martech), the award judges and the whole MarTech Conference team. It’s an amazing event.
If you’re curious, here are all the Stackie Entries. An impressive collection!
The post Informatica wins a coveted Stackies Award appeared first on The Informatica Blog – Perspectives for the Data Ready Enterprise.
Source: Informatica Perspectives