Buying your first home can seem at times like climbing a particularly steep hill – daunting, confusing and with several pitfalls along the way. Prices are still rising, with the average UK first-time buyer home now costing £184,973, 7% up on a year ago.
And finding the money for a deposit without help from the Bank of Mum and Dad can be a real challenge – the typical first-time buyer deposit is now £33,222, or 133% of an average salary. The average first-time buyer borrowed 3.49 times their income, and the average first-time buyer loan was an estimated £136,000.
But with a few simple steps to prepare yourself financially, and make lenders see you in a positive light, you could approach buying your first home with a lot more confidence.
Show lenders you can manage credit well
Try to make yourself attractive to lenders – they’re not looking for anything out of the ordinary, just evidence that you’ll be a reliable and responsible borrower. They may want to see that you’ve kept up to date with credit payments, the total level of credit you already have, and how much of it you’re using. So showing lenders that you can manage credit accounts such as credit cards, mobile phone contracts and even some utility services could really help your case.
Choose the right mortgage
Fixed or variable? Repayment or interest-only? The type of mortgage you choose will make a difference to the amount that you repay every month, so you need to think it through carefully.
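To see why the choice matters, here is a rough sketch of the monthly cost difference between a repayment and an interest-only mortgage. The loan figure comes from the article; the 3% rate and 25-year term are illustrative assumptions, not a quote.

```python
# Rough illustration of repayment vs interest-only monthly costs.
# The rate and term are illustrative assumptions, not any lender's offer.

def repayment_monthly(principal, annual_rate, years):
    """Standard amortizing payment: principal + interest each month."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

def interest_only_monthly(principal, annual_rate):
    """Interest-only: lower payments, but the principal is still owed at the end."""
    return principal * annual_rate / 12

loan = 136_000                    # average first-time buyer loan from the article
rate = 0.03                       # assumed 3% annual rate
print(f"Repayment (25y): £{repayment_monthly(loan, rate, 25):,.2f}/month")
print(f"Interest-only:   £{interest_only_monthly(loan, rate):,.2f}/month")
```

The interest-only payment looks far cheaper per month, but remember the full £136,000 remains outstanding at the end of the term.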
You can compare mortgages with Experian CreditMatcher. We are a credit broker not a lender, working with selected lenders.†
There are also specific schemes for first-time buyers including shared ownership and the Government’s Help to Buy loan scheme. Mortgages for key workers, such as nurses, are another option.
Look at affordable home-buying schemes
- Shared Ownership. This is where you buy a share of a home – usually a minimum of 25% – and pay rent to a housing association on the rest. So you only need a mortgage and deposit for the share of the home’s value you’ve bought; in most cases the minimum deposit is just 5% of the share you’re buying. You can usually then buy further shares, up to the full 100% of the home’s value. There are conditions, so make sure you read up on the scheme before applying.
- Help to Buy ISA. If you save up to £200 a month, the government boosts your savings by 25%, up to a maximum bonus of £3,000. Some financial advisers favour the Lifetime ISA, launched on 6 April, which has a higher annual maximum contribution (£4,000) and earns interest from day one, with a 25% bonus at the end of the tax year. The wider Help to Buy scheme also lets you buy with as little as a 5% deposit, via an equity loan or mortgage guarantee.
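The 25% bonus arithmetic above can be sketched as follows (scheme rules are simplified here; check the official terms):

```python
# Simplified sketch of the Help to Buy ISA bonus: the government adds 25%
# of what you've saved, capped at a £3,000 bonus (reached at £12,000 saved).
# Scheme rules are simplified -- this is an illustration, not advice.

def help_to_buy_bonus(total_saved):
    return min(total_saved * 0.25, 3_000)

# Saving the £200 monthly maximum:
for months in (12, 36, 60):
    saved = 200 * months
    print(f"{months:2d} months: saved £{saved:,}, bonus £{help_to_buy_bonus(saved):,.0f}")
```

So at the £200-a-month maximum it takes five years of saving to earn the full £3,000 bonus.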
See your credit report before lenders do
Check that your Experian Credit Report is fit and ready before applying for a mortgage, as lenders use credit information, along with information on your application form and elsewhere, to make their decision. It also gives you the chance to check the information on your credit report is correct and up to date.
Get your credit report in shape
Simple steps could help, like getting on the electoral roll, not missing any credit payments and not making new credit applications in the six months before you make your mortgage application.
Get budgeting and financial planning
Mortgage affordability rules mean that lenders take into account not only how much you are earning, but how much you are spending. Basically, they want to know you’ll be able to afford your mortgage payments, especially if your financial situation changes – for example if you change your job or have children, or if interest rates go up. So if you can, try to cut back on outgoings you don’t need, clear any overdrafts and try to put a little aside each month.
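The kind of check described above can be sketched roughly as follows. The income, outgoings and the 30% stress uplift are illustrative assumptions, not any lender's actual criteria.

```python
# Rough sketch of an affordability stress test, in the spirit of the rules
# described above. All thresholds are illustrative assumptions.

def affordable(monthly_income, monthly_outgoings, monthly_payment,
               stress_rate_uplift=0.3):
    """Check the payment fits now AND if it rose by the stress uplift."""
    disposable = monthly_income - monthly_outgoings
    stressed_payment = monthly_payment * (1 + stress_rate_uplift)
    return stressed_payment <= disposable

print(affordable(2_500, 1_400, 700))   # stressed payment £910 vs £1,100 spare
print(affordable(2_500, 1_900, 700))   # stressed payment £910 vs £600 spare
```

Note how cutting £500 of monthly outgoings flips the second case from a refusal to an approval – which is exactly why trimming unnecessary spending before applying can help.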
Put enough cash aside for the extra costs
Don’t forget that as well as saving enough for a deposit, there are other costs such as stamp duty, surveyor’s fees and legal fees. There may well be valuation costs and administrative charges to think about, so try to put enough cash aside for the extra costs.
†Experian acts as a credit broker and not a lender in the provision of its credit cards and personal, car finance and guarantor loans matching services, meaning it will show you products offered by lenders and other brokers.
Experian acts independently and although CreditMatcher shows products for a range of lenders and other brokers it does not cover the whole of the market, meaning other products may be available to you. CreditMatcher services are provided free however we will receive commission payments from lenders or brokers we introduce you to. For information about the commission we receive from brokers for mortgages and secured loans click here.
CreditMatcher is provided by Experian Ltd (Registered number 653331). Experian Ltd is authorised and regulated by the Financial Conduct Authority (firm reference number 738097). Experian Ltd is registered in England and Wales with registered office at The Sir John Peace Building, Experian Way, NG2 Business Park, Nottingham, NG80 1ZZ.
Copyright © 2017, Experian Ltd. All rights reserved
The Informatica team is very excited to celebrate the 10th birthday of Informatica Cloud. During the past decade, we’ve worked very hard, with a relentless focus on execution, to achieve a singular mission – to help our customers bridge the cloud and on-premise worlds seamlessly.
To mark this milestone, let’s take a drive down memory lane and reminisce about some of the great cloud achievements and milestones we have hit. There is no better person to do this with than Informatica’s Ron Lunasin – one of the most popular veterans at Informatica and a founding member of the Informatica Cloud team. And guess who had the privilege of interviewing him? One of the newest members of the Cloud team: me!
Ron Lunasin’s Interview by Anwesa Chatterjee – Informatica Celebrates 10 Years of Cloud Integration!
In case you don’t know Ron – he is Informatica’s fifth employee and a founding member of the Informatica Cloud team. Ron has been with Informatica for 20 years, and frankly it is hard to believe, as he doesn’t look a day over twenty. Informatica has kept him looking great throughout the years!
Anwesa: You have been at Informatica for 20 years – tell us more about your journey. How did it all start? What was your role?
Ron: I started as an intern in June 1995, where I helped write the first Sybase driver. The team was very small and worked out of two small rooms in a little office in Menlo Park. In addition to being one of the early employees, I was one of the founding members of the Cloud team in September 2005. At that point, we were called the On-Demand team. I have spent more than a decade with the Informatica Cloud team in various roles, across Product Management and Field Solutions.
It has been an exciting journey: we have proved a great deal to ourselves, our customers and our partners, and shown the competition a thing or two. We started out as leaders in the on-premise world and are now leaders in cloud too, as recently called out by Gartner in the iPaaS Magic Quadrant! Our initial focus was on our partnership with Salesforce – they were the cloud leader then and still are today. We started by solving the connectivity challenge between on-premise systems and Salesforce. This was mainly done to help existing PowerCenter customers implement Salesforce and make the transition from on-premise CRM systems. Our initial focus was on augmenting the best-in-class on-premise solution with the connector and selling it via subscription.
Soon, we realized that the cloud world is quite different. Users were very different from the usual integration specialists we knew, and so were the use cases. Salesforce admins were running lightweight integrations and data management on their own and needed to be empowered with the right tools. The existing on-premise product was not suitable. We needed to do something different, and we realized that very quickly. We built an on-demand product – what we call Informatica Cloud today – from the ground up on a true multi-tenant cloud architecture. This product supported our customers with the same speed, agility and scale as any other true cloud vendor, like Salesforce or Amazon.
We launched the beta and went GA in 2007. There were a few small vendors in this space, as well as a handful of ESB/EAI-focused vendors who were re-purposing their existing on-premise products by hosting them on cloud platforms like Amazon. We tasted early success as we embraced cloud full-on by offering free product trials and conveniently priced solutions.
Very soon, we became the #1 most popular offering on the Salesforce AppExchange. This was a major achievement, since it was customers who were seeing great value in our solution. Fast forward 10 years, and today we are a clear leader in the cloud space, with the most innovative solutions to help our customers make the journey to cloud.
Anwesa: What keeps you excited about Informatica? What is the one thing that makes you keep going here?
Ron: It is the company’s culture and its ability to reinvent itself as the leader every time. From a five-person company to more than 3,600 employees today, the company has kept its culture, agility and focus intact. That’s what keeps me driven to be a part of the next wave of innovations at Informatica.
In the early days, the team recognized that the opportunity with cloud was going to be huge – a lot of credit goes to our previous CEO, Sohaib Abbasi. At his first company all-hands meeting, he showed a slide highlighting the data tsunami that was coming and how on-premise data behind the firewall would need to integrate with the growing cloud sources. He gave us insight into the growing challenges our customers faced around data integration and data management across these two worlds. We kept executing on the vision with the goal of solving those challenges. Many team members from those early days, like Sanjay Krishnamurthy, Pinaki Mukherjee and others, helped drive the initial innovation for Cloud.
The 10-Year Journey of Informatica Cloud Integration
Informatica Cloud grew from a self-sustained start-up incubated inside Informatica into the iPaaS leader in 10 years. At first, we had our own sales, marketing, customer success, support and development teams. We were the first team to have customer support managers helping customers with all their needs after they had deployed – and this was back in 2007. We adopted a true cloud culture by learning from pioneers like Salesforce. We embraced Cloud in the early days.
During the first few years, we had some great early success and we were thinking of spinning it off – the “Innovator’s Dilemma” was top of mind. We knew the dilemma would catch up with us, and proving ROI would be hard. We were certainly generating revenue fast and growing at a huge rate, but not enough to justify the spend on engineering.
By going private, we did a reverse spin-off, and that proved very successful for us. As a private company, we have the ability to embrace market shifts and innovate fast, bringing what the market and our customers require. That would have been more challenging as a public company.
Anwesa: When you started the Informatica Cloud team, could you have imagined, even in your wildest dreams, that it would be this successful?
Ron: I actually thought we would be successful from the very early days. We continue to execute on a plan we created 10 years ago without needing to deviate in any significant way. Everything we did from a strategy and product standpoint – building a robust cloud architecture and business model from the ground up – worked perfectly. I’m not at all surprised at where we are today: a leader in Enterprise iPaaS and the leading Cloud and Hybrid Integration and Data Management company. I envision Informatica becoming synonymous with cloud, to the point where we probably won’t need to call Informatica Cloud a separate product anymore. I can envision that in the near future every product Informatica offers will be cloud-based, to help our customers ride the next wave.
As far as my journey goes, I see myself sitting on a beach one day, feeling proud to have been part of this exciting adventure and of the huge opportunity that stands in front of us today.
Join us at Informatica World for help along your Journey to Cloud
Source: Informatica Perspectives
Companies are modernizing the way they do business to compete in a digital world. Virtually every business, in every industry, is in some stage of digital transformation. By the end of 2017, digital transformation will be the central strategy in at least two-thirds of Global 2000 companies, according to an IDC study. And none will find the path forward obvious or easy.
At Informatica we find ourselves at the center of digital transformation strategy as we help our customers navigate and solve critical challenges. We have also seen how difficult the journey can be. And for every company that has earned some success, there are 50 more that are just getting started.
We want to help businesses understand the journey ahead. It was with that idea in mind that we planned our keynote Monday, May 15 at Informatica World 2017 in San Francisco. This session is a must-attend for every businessperson developing a strategy to use digital information to be more responsive, agile and profitable.
MDM’s critical role in digital business
The reasons for digital transformation may vary: Top drivers include customer experience transformation, regulatory compliance and data-driven sales and marketing. But in every case data is at the heart of the journey. Master data management (MDM), which gives context to big and small data alike, is the key to delivering these business drivers.
MDM is a required core competency for digital business because only MDM delivers context and semantic consistency across data domains. It connects distributed data across the cloud and on-premise systems to ensure the data relationships are managed efficiently in one place. Organizations can leverage that consistency to create a customer-centric view and move away from traditional product-centric views.
These recent use cases showcase the power of MDM to achieve different transformation goals:
- Customer experience: Employees of Hyatt Hotels are able to track a variety of guest behavior and preference data and share it across the global chain. This enables them to delight guests with customized service, for example, a guest who does yoga in her room is offered use of a yoga mat wherever she travels. Read Hyatt’s story.
- Regulatory compliance: Federal mortgage lender Freddie Mac created a golden record of all customers, which allows it to compile accurate audit information far more rapidly and cost-effectively. It can even be automated to meet recurring compliance obligations. Learn more about finance industry challenges.
- Big data analytics: By building engines embedded with IoT sensors, GE Aviation was able to stop selling capital equipment and instead charge airlines for usage. But that model only works if they know exactly how well each engine is performing. MDM provides the business-critical asset profile, asset service history and customer information they need to understand the complete picture. Learn about MDM and big data.
- Mergers and acquisitions: When giants Dell and EMC undertook the biggest technology merger in history, their incongruous data systems rose quickly to the top of the challenge pile. Fortunately, executives understood the tremendous business opportunities that would come from data integration. An upcoming session at Informatica World will share details on how Dell-EMC tamed its data chaos and increased market competitiveness with MDM.
Developing the transformation skill-set
While MDM transformation isn’t simple, it is a must-do. Organizations can begin with an ambitious plan or a simple one; some are midway through their transformation and others are just getting started. Informatica World attendees will find relevant information for whatever stage of the journey they are at.
I believe our keynote session, titled “Master Data Management Industry Perspective,” is can’t-miss content for those developing a strategy to deliver trusted and usable real-time data to their businesses in pursuit of their digital transformation goals. Attendees will learn:
- Where MDM is headed in the future.
- How other business leaders are using MDM in major business transformations.
- How MDM can make you a hero in your organization by helping you solve complex data challenges.
The featured speakers include:
- Suresh Menon, SVP and General Manager for MDM at Informatica, will open the session with a vivid picture of where MDM is going and the new capabilities emerging in MDM to improve the long-term value proposition of any digital transformation effort.
- Next, Holger Muller, VP and Principal Analyst at Constellation Research, will discuss how social media and other digital trends are making it harder for companies to do what they need to do. Muller will also share some inspiring use cases from companies that have tackled difficult challenges.
- A panel discussion will follow, with Menon and Muller joined onstage by business executives with significant MDM implementation experience. They will share how MDM has helped them segment their marketing and communication to optimize customer experience and revenue. Panelists include:
Source: Informatica Perspectives
Source: Informatica Press Releases
Are you registered to vote in the UK? Yesterday Prime Minister Theresa May announced a General Election to take place on 8 June. The election hasn’t officially been called but it’s likely the deadline to register to vote will be midnight Monday 22 May in order to be eligible to vote in the 2017 general election. And did you know that being on the Electoral Roll also could help improve your credit score?
Here are five things you should know about registering to vote:
1. How can it help improve your credit score? It’s important that your credit report includes your Electoral Roll details, as lenders use this information to help confirm your name, address and where you’ve lived before. This info usually has to be up to date before they are willing to offer a mortgage, a loan or any other form of financial account.
2. How could it negatively affect you if you aren’t on the Electoral Roll? Not being registered could cause a delay when you apply for credit, while the lenders confirm your details some other way. With some lenders it can even hurt the credit score they give you, and some applications may even be turned down.
3. What if I move around a lot? If you are living at a temporary address, it’s also possible to use your parents’ address for things like the Electoral Roll and as a base for your credit agreements. This might even be safer, in terms of the risk of identity fraud, especially if your temporary address has shared access.
4. How long does it take? Once you’ve registered, it may take a little while for this information to appear on your credit report, as councils usually process updates to the Electoral Roll once a month and send the information to the credit reference agencies like Experian. These updates can also be suspended for a few months if a council does an ‘annual canvass’, where they carry out an audit of all households. If you register to vote for the first time or at a new address, your credit report should automatically be updated within around a month, but it could be worth checking with your local authority to make sure.
5. So how do I do it? You can register to vote, or update your name, address or other details on the electoral register at Gov.uk. If you would like to find out about elections in your area, visit Your Vote Matters and type in your home postcode.
Sensitive data is at the heart of all data breaches and protecting it is among most organizations’ top priorities. But why is protecting the data crown jewels so difficult? Firstly, regardless of controls on how data is exchanged among organizations and countries, data spread is unavoidable.
Gaining visibility into that spread is a first-order problem that must be solved before you can protect data and detect potential threats. The big challenges organizations must first solve to protect sensitive data are:
- Lack of enterprise visibility to sensitive data
- Not knowing where to start – what / where are the most critical data to protect?
- Inadequate data protection
- Inaccurate and slow detection of threats – insider activities are just as important as hackers; around 20% of data breaches are caused by insiders or 3rd parties
Solutions for detecting insider threats and protecting sensitive data need to address these problems.
Enterprise Visibility – Confirm What You Know and Learn What You Don’t Know
When asked whether they know where all their sensitive data are located, some information security professionals, data owners and enterprise architects may believe that they have this information. But with the extent of data proliferation, shadow IT, and information sharing across individuals and organizations, is their confidence false, or do they have true visibility? Regardless of the state of knowledge within your organization, it is always prudent to confirm what you know and learn what you may have missed. For this purpose, enterprise sensitive data discovery and classification is an essential first step.
Discovery and classification is usually a combination of an automated process that scans data stores across the enterprise to identify sensitive data, and iterative human curation. Such a solution needs to scale from hundreds to tens of thousands of data stores, and to provide both an offline and an interactive way to review the results so that data owners and security staff can act on them meaningfully.
Classification policies also need to be flexible, supporting advanced rules and analytics to identify the combinations of data elements that truly constitute sensitive data. For example, a person’s name by itself is not sensitive data, but combined with the individual’s address, phone number or email address, it is. Rules rich enough to express the possible combinations of data elements that could constitute sensitive data are required, especially under regulations such as GDPR. Also, data that is sensitive now may not be in a month. An example is a public company’s quarterly results, which are confidential only until the quarterly announcement. The ability to determine sensitivity based on time therefore increases the accuracy of discovery.
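A minimal sketch of such combination and time-based rules might look like the following. The field names, the embargo example, and the rule structure are all hypothetical illustrations, not any product's policy format:

```python
from datetime import date

# Minimal sketch of combination-based classification: a name alone is not
# sensitive, but name + a contact detail is. Field names and the embargo
# example are hypothetical illustrations of the rules described above.

CONTACT_FIELDS = {"address", "phone", "email"}

def is_sensitive(columns, today=None, embargo_until=None):
    cols = set(columns)
    # Combination rule: name plus any contact detail is sensitive.
    if "name" in cols and cols & CONTACT_FIELDS:
        return True
    # Time-based rule: e.g. quarterly results confidential until announced.
    if embargo_until and (today or date.today()) < embargo_until:
        return True
    return False

assert not is_sensitive(["name"])                      # name alone: not sensitive
assert is_sensitive(["name", "email"])                 # name + email: sensitive
assert is_sensitive(["q3_results"], today=date(2017, 9, 1),
                    embargo_until=date(2017, 10, 15))  # still embargoed
```

Real classification engines add pattern matching, reference data and analytics on top of rules like these, but the combination principle is the same.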
Visibility is not only about knowing what sensitive data you have and where it is, but also:
- What is its level of sensitivity?
- What is its value to the organization?
- Where did it come from?
- Where has it gone to?
- Where has the data been copied, moved to, shared with?
- What other data stores has it been propagated to?
- Has it crossed regional boundaries governed by inter-country regulations?
- Who has access to it?
- Who accessed it?
- How was it accessed? From where? How often?
- What are the regulations governing it?
- Is it protected?
- How is it protected?
Risk Driven Approach to Prioritize Data Protection
All of the above information about sensitive data determines the risk associated with it, how you should prioritize its protection, and how it should be protected. Based on the risk assessment – weighing the likelihood of the risk against its impact – you can determine whether to avoid, mitigate, transfer or accept it.
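The likelihood-vs-impact weighing can be sketched as follows. The 1-to-5 scoring scale, the thresholds, and the dataset names are illustrative assumptions, not a standard:

```python
# Sketch of the likelihood-vs-impact weighing described above. Scores and
# the treatment thresholds are illustrative assumptions.

def risk_score(likelihood, impact):        # each scored 1 (low) .. 5 (high)
    return likelihood * impact

def treatment(score):
    if score >= 15:
        return "avoid/mitigate"            # high risk: act on it directly
    if score >= 8:
        return "transfer"                  # e.g. insure or outsource the risk
    return "accept"                        # low risk: monitor only

datasets = {
    "customer_pii_db": (4, 5),             # likely target, severe impact
    "public_catalog":  (2, 1),             # unlikely target, trivial impact
}
for name, (lik, imp) in datasets.items():
    print(name, treatment(risk_score(lik, imp)))
```

Sorting data stores by such a score gives a defensible order in which to roll out the protection methods listed below.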
If you decide to avoid or mitigate, you can determine the method. The following are among the possible methods depending on the types of threats to defend against:
- Encrypt (device, file, database, field level)
- Mask (persistent or dynamic)
- Control access
- Archive (Manage retention)
- Notify Owner
Once you’ve determined the priority of protection and how a data set should be protected, the deployment and enforcement of protection needs to be automated and easy to apply to multiple data stores across the enterprise. A policy-based approach allows the definition, application, and enforcement to be standardized across the organization and easily managed and monitored.
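One way to picture such a policy-based approach: a policy is defined once per data classification, and enforcement is derived from it for every matching store. The structure below is a hypothetical sketch, not Informatica's actual policy format:

```python
# Hypothetical sketch of policy-based protection: one policy per
# classification, applied uniformly to every matching data store.

POLICIES = [
    {"classification": "pii",       "method": "mask",    "mode": "dynamic"},
    {"classification": "financial", "method": "encrypt", "mode": "field"},
]

DATA_STORES = [
    {"name": "crm_db",      "classification": "pii"},
    {"name": "payments_db", "classification": "financial"},
    {"name": "crm_replica", "classification": "pii"},
]

def enforce(stores, policies):
    """Return the protection action for each store, driven purely by policy."""
    by_class = {p["classification"]: p for p in policies}
    return {s["name"]: by_class[s["classification"]]["method"]
            for s in stores if s["classification"] in by_class}

print(enforce(DATA_STORES, POLICIES))
```

Note that both PII stores get the same masking treatment automatically: adding a newly discovered replica requires no new policy definition, which is exactly what makes the approach manageable and auditable at scale.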
Bridging the Gap between Policy and Enforcement
Usually, the staff who define what data should be protected, and how, are different from those who actually implement and enforce the protection. How do you ensure that the defined policies are actually enforced? An automated method of orchestrating the application of policies is required, so that there is an automated hand-off, audit and resolution of protection enforcement on the target data stores.
Timely Detection of Threats to Sensitive Data
The average time between a breach and its detection is over 100 days. Detection is even more difficult when it involves employees, contractors, administrators, business partners, customers or compromised credentials. Detecting insider threats requires monitoring and logging of user activity, plus advanced analytics and machine learning to baseline the normal behavior of each user and their peer group. With a baseline, you can then identify unusual or suspicious activities that deviate from the norm.
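A toy sketch of the baselining idea: flag activity that deviates sharply from a user's historical access counts. Real solutions use far richer features and peer-group models; the z-score threshold here is an assumption.

```python
import statistics

# Toy sketch of baselining "normal" behavior: flag activity that deviates
# sharply from a user's historical access counts. The threshold is an
# illustrative assumption, not a recommendation.

def is_anomalous(history, todays_count, z_threshold=3.0):
    """Flag today's activity if it is > z_threshold std devs above baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = (todays_count - mean) / stdev if stdev else float("inf")
    return z > z_threshold

daily_record_accesses = [40, 35, 50, 45, 42, 38, 47]   # a typical week
print(is_anomalous(daily_record_accesses, 48))     # within normal variation
print(is_anomalous(daily_record_accesses, 5_000))  # mass export: flagged
```

The same comparison against a peer group's baseline (rather than only the user's own) is what catches an account that was anomalous from day one.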
Real-time Closed Loop Remediation
When anomalous activities or violations are detected, remediation actions need to be taken: reviewing and investigating the incidents, then resolving or remediating the problem. The detection system should trigger an automated workflow for review by the security team, with resolution or remediation activities taken by downstream owners.
Discovery, detection, orchestration, remediation, and continuous risk monitoring and management are critical capabilities for a data security solution that provides enterprise-wide visibility and control of sensitive data. Timely detection of insider threats and prioritization of remediation are key challenges that few solutions handle with high accuracy while reducing the alert fatigue experienced by security analysts.
To learn more ways to reduce the risk of data breach while improving compliance and governance, we invite you to join us at Informatica World 2017, May 15-18 in San Francisco.
Source: Informatica Perspectives
The 2007-2008 global financial crisis led to multiple regulations created to protect citizens, banks and the economy itself. Dodd-Frank, the Markets in Financial Instruments Directive (MiFID) II, BCBS 239 and others all define numerous principles, guidelines and imperatives that must be complied with, sometimes at very high cost. But even after spending millions of dollars, there remains a lot to be done, especially to accelerate regulatory compliance.
A case in point is BCBS 239. It consists of 14 principles that must be adopted for full compliance. The principles were first published in January 2013, with an implementation deadline of January 2016 for G-SIBs (global systemically important banks). A recent report, however, suggests that only one G-SIB could satisfy all the required principles, and the Basel Committee is now urging banks to step up their compliance efforts.
Complying with these regulations is a complex, enterprise-wide exercise that requires consolidating information across a broad range of functions and legal entities. While there are multiple regulations, a common data management theme across all of them is the need to be transparent. For example, when banks are asked to report aggregate risk metrics for BCBS 239, they are also asked to prove to regulators that the risk scores have been arrived at correctly by:
- Showing data lineage from data source to report: where the data originated, how it was transformed, and what processes and decisions affected its life-cycle
- Demonstrating data quality controls: the identification, assessment and management of data quality
- Displaying an enterprise-wide understanding of business concepts: by making a data dictionary of concepts available to the appropriate teams in the organization
A banking CDO I spoke to recently had a team of 20+ people charged with manually drawing lineage diagrams for BCBS 239. Over two years, they had developed lineage diagrams for 200+ risk metrics. While these diagrams covered multiple dimensions – quality, people, processes and so on – they were:
- Onerous to create: the team had to talk to multiple data owners across functions to understand which key data elements were stored and how they moved
- Onerous to maintain: the team had to keep track of even small changes within the source systems
This is where Informatica can help. With Informatica’s integrated data governance stack*, customers can now take advantage of an intelligent solution for data governance and compliance. Below, I cover a few ways in which Informatica’s Enterprise Information Catalog, the machine-learning-based data discovery and cataloging solution, can accelerate compliance by providing automated mechanisms that make existing data management practices more transparent:
Smart Cross-system Data Discovery: Using machine learning and rule-based approaches, Enterprise Information Catalog can automatically identify key data elements by scanning enterprise data sources. EIC has pre-built connectors to automatically scan a wide variety of data sources, including databases, data warehouses, cloud applications, big data sources, cloud databases, BI tools, modeling tools and more. Additionally, the domain discovery capability allows EIC to auto-infer whether a key data element exists in a resource by looking for data patterns, column names, reference data values and so on. This step saves the governance team weeks of effort, as key data elements are identified automatically across systems without having to hunt for them in meetings.
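To make the idea of domain discovery concrete, here is a toy sketch of inferring a column's domain from its name and sample values. The patterns, hint lists and the 80% vote threshold are illustrative assumptions, not EIC's actual algorithm:

```python
import re

# Toy sketch of domain discovery: infer whether a column holds a known
# domain (e.g. email, phone) from column-name hints and value patterns.
# Patterns and the 80% threshold are illustrative assumptions.

DOMAIN_RULES = {
    "email": {"name_hints": ("email", "e_mail"),
              "pattern": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")},
    "phone": {"name_hints": ("phone", "tel", "mobile"),
              "pattern": re.compile(r"^\+?[\d\s-]{7,15}$")},
}

def infer_domain(column_name, sample_values, threshold=0.8):
    for domain, rule in DOMAIN_RULES.items():
        name_match = any(h in column_name.lower() for h in rule["name_hints"])
        hits = sum(bool(rule["pattern"].match(v)) for v in sample_values)
        value_match = hits / len(sample_values) >= threshold
        if name_match or value_match:
            return domain
    return None

print(infer_domain("contact_email", ["a@b.com", "c@d.org"]))
print(infer_domain("col_17", ["x@y.com", "z@w.net", "q@r.io"]))  # values alone
```

The second call shows why value profiling matters: an opaquely named column is still identified as holding email addresses.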
Automated Data Lineage Diagram Creation: EIC extracts metadata from multiple sources, including ETL tools, BI reports and SQL procedures, to create a comprehensive, end-to-end lineage diagram automatically. Because this information is extracted directly from source systems, it is much more reliable than hand-drawn diagrams, which can go stale. A caveat: given the nature of data movement processes in organizations, it may be difficult to capture all of them automatically – especially manual tasks (hand-keying data) or data movement via proprietary hand-coded applications. Even so, the technical lineage diagram gives the governance team a head start, again saving weeks of effort, and can help validate the accuracy of existing lineage diagrams.
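At its core, such a lineage diagram is a directed graph from source systems to reports, with edges derived from extracted metadata. A minimal sketch (the system names are hypothetical):

```python
# Minimal sketch of end-to-end lineage as a directed graph: edges come from
# extracted metadata (ETL jobs, SQL, reports). System names are hypothetical.

LINEAGE_EDGES = {
    "core_banking_db": ["risk_staging"],
    "trades_db":       ["risk_staging"],
    "risk_staging":    ["risk_datamart"],       # ETL transformation step
    "risk_datamart":   ["bcbs239_report"],      # BI report consumes datamart
}

def upstream_paths(graph, target, path=None):
    """All source-to-target paths: where did this report's data come from?"""
    path = [target] + (path or [])
    parents = [src for src, dsts in graph.items() if path[0] in dsts]
    if not parents:
        return [path]
    return [p for parent in parents
            for p in upstream_paths(graph, parent, path)]

for p in upstream_paths(LINEAGE_EDGES, "bcbs239_report"):
    print(" -> ".join(p))
```

Answering a regulator's "where did this risk metric come from?" then reduces to a graph traversal rather than a two-year manual drawing exercise.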
Collaborative Catalog for Business and IT: It is especially important for regulated organizations to have business and IT aligned on data definitions, policies and initiatives. This ensures consistency in the way data is created, used and communicated across the organization. Enterprise Information Catalog provides a unified view of all metadata (business, technical, relationships, quality, lineage, usage and more), accessed through a powerful search interface, and gives business and IT a single place to collaborate on data assets. All the work done in defining and enriching business context around data assets is also used by business users in discovering the right data asset for their needs.
More regulations are around the corner. In about a year, all organizations handling EU citizens’ personal data must comply with the new General Data Protection Regulation (GDPR). Non-compliance will cost up to US$22 million or 4 percent of annual global turnover, and the deadline is May 2018. CDOs and their organizations no longer have the luxury of reacting to individual regulations one at a time. Instead, it is time to build governance strategies on principles like transparency and automation.
*Informatica recently introduced Informatica Axon, the industry’s first fully integrated, enterprise data governance solution. Axon integrates seamlessly with Enterprise Information Catalog and other industry-leading Informatica data management solutions for data quality, master data management, big data and cloud to form the only complete, unified data governance offering, for any market and any size enterprise.
Source: Informatica Perspectives
The dynamic world of hybrid computing creates tremendous opportunity for organizations to share, analyze and monetize their data. Unleashing the wealth of information in vast data stores provides competitive advantage and the ability to forge satisfied and loyal customers.
And the new uses and applications for data mean a continuously expanding landscape of users and data stores and the growth and proliferation of the data itself. Static data stores and perimeters have vanished, replaced by environments where change is the only constant.
The security and governance of this environment is key; organizations must have a current and accurate view of their data to achieve compliance with external regulations and the privacy standards promised to customers. While traditional security remains important, data security practices, policies and controls must adapt, bringing intelligence, monitoring and protection to the most precious asset: sensitive data.
At Informatica World 2017, our customers and partners will highlight their innovative programs for dealing with dynamic data environments consisting of cloud, data lakes and traditional data stores: how they secure customer data, how they ensure data protection controls are applied where they are needed and will have impact, how they are preparing for the looming General Data Protection Regulation (GDPR), and what the road to successful data archiving looks like. Our product teams will provide insight into our award-winning data security products.
At Informatica World 2017, you will hear from customers, experts and partners who will provide insight on how to reduce data risk by reducing breaches and internal misuse:
- Customer sessions on the identification and protection of critical customer data and how to leverage data security intelligence to identify and prioritize data protection programs.
- Expert sessions on cloud and data security, on GDPR readiness and how to unleash DevOps efficiency with Test Data Management.
- Product specialists will review the latest innovations in Informatica’s data security solutions, including Secure@Source and Data Masking.
- Roundtable, deep-dive and Meet the Expert stations where you can discuss and experience data security, GDPR readiness and compliance.
- Solutions Expo where you can learn about our current offerings and their key capabilities. Through interactive demos, you will see how these solutions can help you reduce sensitive data risk.
- Customer Advisory Boards will help us continue our innovation and quality.
Details on our sessions can be found in the Informatica World Catalog; simply select the Data Security track. You will also find Informatica Data Security featured on the main stage during our general sessions.
This is a must-attend event: architects, security professionals, data governance and compliance practitioners, and management will all benefit from insights into how to create lower-risk architectures, improve data security and govern sensitive data for compliance.
Source: Informatica Perspectives
Informatica World is a unique place to gain knowledge around all things data. In my recent post, I shared key reasons why I’m looking forward to Informatica World 2017. One of these reasons is the chance to learn new Master Data Management skills.
Increasing your Master Data Management skills can help your company get ahead, as well as make a crucial difference in your personal career options. With that in mind, here are some things you can do at Informatica World 2017 to increase your Master Data Management expertise.
1) Get an Industry Perspective on Master Data Management
This year, Informatica World 2017 offers an Industry Perspective session on Master Data Management. In this session, Constellation Research will share how cloud and social media have made Master Data Management essential to organizational survival. During the session, you’ll learn specific ways to use MDM to turn big data into big insights. And, throughout the week, you’ll hear examples of partners and customers who’ve used MDM to improve customer experience, optimize business processes and ensure regulatory compliance. You’ll also hear from our customers and partners about how they used their Master Data Management skills to become heroes at their companies.
2) Learn What’s New with Informatica MDM Solutions
In session MDM101, “What’s New with Informatica MDM Solutions,” you’ll walk through an overview of our portfolio, highlighting both what’s available today and what’s yet to come. You’ll learn new ways to improve customer experience and increase compliance. You’ll take a look at modular end-to-end MDM solutions. Specifically, you’ll review Product 360, Customer 360, Supplier 360, and the new Relate 360.
3) Learn How Master Data Management Skills Increase Merger and Acquisition Success
Mergers and acquisitions always involve data, and Master Data Management is a key driver in their success. MDM allows companies to streamline mergers, realizing synergies more quickly while reducing risk. An effective MDM strategy not only helps preserve business continuity during the transaction, but also serves as a driver for innovation. In session MDM102, “Importance of MDM in M&As: A Dell-EMC Case Study,” you’ll learn how these large companies integrated business-critical data and increased their market competitiveness.
4) Learn how Master Data Management Skills Help Reduce Risk and Avoid Fraud
American International Group, Inc., also known as AIG, is an American multinational insurance corporation with more than 88 million customers in 130 countries. AIG initiated an MDM initiative to ensure compliance with strict government regulations and to better understand customers. Prior to MDM, their architecture did not provide a holistic view of customer and transaction information. In session MDM103, “How AIG Insurance uses Informatica MDM to Reduce Risk and Avoid Fraud,” you’ll learn how AIG strategically manages customer data by consolidating fragmented, duplicated and inconsistent data. Specifically, you’ll see how MDM helps AIG comply with governmental regulations, reduce risk exposure and avoid potential fraud.
5) Learn how to Market and Sell Master Data Management Within Your Company
BJC Healthcare and Washington University School of Medicine embarked on the MDM journey after Informatica World 2016. In session MDM104, “How and Why I Market & Sell MDM Internally at BJC Healthcare,” you’ll learn ways to get buy-in from your executives for a trusted data initiative. You’ll hear about the capabilities of an end-to-end MDM platform, where to start with MDM, how to create a business case, and how to explain the value of an enterprise MDM initiative.
6) Learn How to Harmonize Customer Master Data in Salesforce.com
McKesson identified a need to consolidate customer accounts within their Salesforce orgs with accuracy and precision. In session MDM105, “Harmonizing Customer Master Data in Salesforce.com,” you’ll learn how they worked with Slalom Consulting and Informatica to customize their data governance. Their newly optimized Salesforce environment provides them with a single view of the customer. This allows McKesson to streamline how teams work with and improve customer data. Specifically, you’ll learn how this initiative has improved productivity and enhanced their application landscape.
7) Learn How Master Data Management Skills Help Gain Customer Insight
The Moller Group is Norway’s largest car importer. Moller Group’s 4,000 employees serve more than 750,000 customers and 67 car dealers for brands like Audi, Volkswagen and Skoda. In session MDM106, “Driving Customer Insights and Addressing Disruption with MDM at Moller,” you’ll learn how Moller combines customer data from thousands of sources. This allows them to provide a central hub for all customer-to-car relationships, which recommends the next best action for the customer in question. Specifically, you’ll learn the importance of multi-domain MDM. You’ll also hear their plans to use MDM to provide context to disruptive changes in the auto industry, such as autonomous cars and IoT.
8) Learn How Master Data Management Can Transform Retail Outcomes
DFS Group Limited is the world’s leading luxury retailer catering to the traveling public. Their global network reaches over 35 million travelers each year. They offer products across a diverse set of categories, including Beauty and Fragrance, Fashion and Accessories, Food and Gifts, Watches and Jewelry, and Wines and Spirits. In session MDM107, “Luxury Travel Retailer DFS’ Experience with Informatica MDM – Product 360”, you’ll learn how DFS uses Master Data Management to transform their merchandising capabilities and enable their digital aspirations.
9) Learn How Master Data Management Can Help Provide Better Care for Children
Illinois Department of Innovation & Technology (DoIT) is a state agency responsible for the information technology functions under the jurisdiction of the Governor. In conjunction with the Health and Human Services Agencies, DoIT uses Master Data Management to create a holistic view of an individual. This enables their departments to serve their citizens more effectively. In session MDM108, “Delivering Better Service to Kids in Need Using MDM at State of Illinois” you’ll learn how MDM helps the Department of Children and Family Services (DCFS) provide better care for children. They will also discuss future plans to support Behavioral Health Services, Medical Services, and Program Eligibility to improve outcomes.
10) Get an Enterprise Architect’s Point of View on MDM
Master Data Management is critical to every data-driven initiative in any organization. It is at the heart of enterprise architecture, breaking down silos and providing a consistent version of the truth about customers, products, suppliers and more. In session MDM109, “Building Blocks of Enterprise MDM: An Enterprise Architect’s Point of View,” you’ll learn the critical architectural considerations you need to make to ensure a successful MDM implementation.
11) Learn Master Data Management Skills for Healthcare
Trinity Health is a national, not-for-profit Catholic health system operating 93 hospitals in 22 states. Their approach is “people-centered health care,” with a focus on better health, better care, and lower costs. With this mission in mind, they realized they needed to create a single source of master practitioner data by integrating practitioner credentialing applications, registration and financial systems, practitioner human resource applications, and more. In session MDM110, “The Art and Science of Governing Master Data: Trinity Health’s MDM Journey,” you will learn how they connected metrics to business requirements, established MDM data governance, and handled the challenges and lessons learned along the way.
12) Learn Master Data Management Skills for Life Sciences
As their business model evolves, life science companies are increasingly adopting new strategies. These strategies include value-based contracting, key account management, multichannel marketing and pricing optimization. ZS Associates is a global consulting leader who, in partnership with Informatica, has developed an MDM implementation specifically to address these needs. In session MDM111, “Informatica & ZS Associates Partner to Bring New Life Sciences MDM App,” you’ll learn how you can realize efficiency, effectiveness and experience with this solution. This implementation provides pre-built connectors, data model configurations and extensions, cleansing and validation rules, business processes and workflows in an easy-to-use UI built for life science professionals.
Grow and Show Your Master Data Management Skills
Well, there you have it: New ways to grow your own Master Data Management skills, just by joining us at Informatica World 2017. And there are SO many more – that’s just a snapshot of what is available. So if gaining these skills is a personal goal of yours, I invite you to register today!
Source: Informatica Perspectives
Absolutely nobody disputes the potential value of big data. It provides an economic way to ask new analytics questions that we were never able to ask before. And that is possible because we are able to combine new, large, and widely disparate data sets in ways that were never economically possible before. The challenge people are now facing is that it is getting harder and harder to show business value.
Getting to Business Value
There is an amazing amount of innovation going on around big data technology. Practically every day some new technology or farm animal is announced. The challenge for all of us is that the rate of innovation can get in the way of delivering actual business results. Here are a couple of common examples:
- When MapReduce first came out, many people jumped on that opportunity and started writing great code with it. Then, Spark came along. Spark was so cool and interesting that many organizations decided to drop what they had been doing with MapReduce and move to Spark. And that meant a total re-write of all the code that had been written, with the loss of thousands of programmer-hours. It is highly likely that there will be a high level of technology change for the foreseeable future. Who knows when we will see a Spark replacement?
- On another vector, imagine having to manage a big data stack. To keep a modest-sized big data environment functioning you are probably looking at a minimum of 6–12 different technologies, for storage, computing, data warehouses, and higher-level analytics. Not to mention data discovery, data prep, data security, data quality and governance, and data visualization. An incredible amount of time is being spent keeping all of those technologies current and integrated with each other. No analytics organization I have ever spoken to wants to be in the system integration business. They want to be delivering actionable insights for their organizations.
- Even more interesting, a great deal of new big data and analytics innovation is starting to appear in the cloud. For example: I don’t think we will see Google Deep Learning offered on-premise any time soon. There are indisputable and well-documented advantages to using the cloud. But a hybrid environment will also mean a higher degree of challenge in designing and managing a hybrid data management architecture that connects the data in the clouds (probably plural) with on-premise systems.
The example below shows just some of the common big data technologies. There is a general progression from older technologies to newer technologies as you move from left to right.
The Only Constant in Big Data Is Change
The question becomes: how do you leverage the best technologies available while still maximizing the return on technology investment in big data for your organization?
You will never get there if you spend most of your time on the big data technology change treadmill. What is required is a data management platform that will enable you to run the big data technology that best fits your business need but abstracts that from the process of data management development.
In the example below, a data management platform separates the data visualization and analytics layer from the underlying big data technologies: compute, storage, distributions, and data warehouses.
What to Look for in a Data Management Platform
In any organization with a data-centric strategy, hand coding just will not scale to enterprise-class problems or to larger groups of developers. In this environment data must be a shared resource available to any system, process, or data self-service. Thought-leading organizations are taking a different approach. They are using data management platforms that provide:
- An end-to-end solution: Full data management includes data discovery, data integration, data quality, data prep, master data management, data security, data governance and more. This should be integrated.
- Modularity: You shouldn’t have to buy the entire platform at once. You should be able to start where it makes sense for you and grow your data management capabilities at the pace that is comfortable for you.
- Abstraction: The platform development environment must provide a layer of abstraction between the development layers and the underlying big data technology. You should be able to code once and have the platform intelligently determine the best engine to run the code on. And it will help a lot if the platform supports the most current engines available.
- Hybrid: The platform must be able to manage data wherever it resides, cloud, on-premise, big data, or something completely different.
- Intelligence: In 2017, IT budgets are starting to grow after many years of flat budgets worldwide. But that will not be enough to scale to the needs of organizations who are looking to compete based on their use of data and analytics. The platform must accelerate productivity by providing intelligence to make recommendations, and automate tasks such as parsing and relating new data for greater understanding.
- Self-service: IT will play a role in delivering data that is ready for business use, but after a point, it makes sense to enable the subject matter experts, the business analysts, to do their own data prep and visualization.
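The abstraction requirement above can be sketched as a thin registry that maps logical jobs to interchangeable execution engines, so job definitions survive an engine swap (say, MapReduce to Spark). The class and method names here are hypothetical illustrations, not any vendor's API.

```python
from abc import ABC, abstractmethod

class Engine(ABC):
    """Interface every execution engine adapter must implement."""
    @abstractmethod
    def run(self, job_name, data):
        ...

class LocalEngine(Engine):
    """Trivial stand-in engine that 'processes' data in-process;
    real adapters would submit to Spark, a cloud service, etc."""
    def run(self, job_name, data):
        return [row.upper() for row in data]

class EngineRegistry:
    """Routes logical jobs to whichever engine is currently preferred.
    Swapping engines means changing the registry, not the job code."""
    def __init__(self, default_engine):
        self.default_engine = default_engine

    def submit(self, job_name, data):
        return self.default_engine.run(job_name, data)
```

With this shape, moving an entire workload to a new engine is a one-line change to the registry rather than a rewrite of every job, which is exactly the MapReduce-to-Spark pain described earlier.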
If all of this seems a bit visionary, I highly recommend that you attend this year’s Informatica World 2017, coming up May 15-18th in San Francisco. In this blog, I have only hinted at some of the things that will be announced at this event. You will really need to come see for yourself how data management is being re-imagined to deliver greater, faster business value.
Source: Informatica Perspectives
The artificial intelligence revolution is gaining critical mass within organizations. A new survey of 835 executives across the globe by Tata Consultancy Services (TCS) finds some 84% are using some amount of AI technology today in their businesses. Areas being touched – or soon to be touched – by AI include IT, sales, marketing, customer service, finance, strategic planning, corporate development, and HR.
Before executives begin to bet their businesses on AI, however, they’re going to have to make sure that the applications have been fed all the data they need. This may take time, as AI-based applications may require time to “learn” until they are fully proficient. One may argue that machines are very different from people, but both need to learn from experience, fed by continuous streams of data. Humans have been learning new things for thousands of years, but machines have only started to develop the capacity to learn in the past couple of years, through artificial intelligence and machine learning.
In a recent Harvard Business Review article, Ajay Agrawal, Joshua Gans and Avi Goldfarb, all with the University of Toronto, explored the developing AI phenomenon, and the challenge organizations face as they move forward with it. As is the case with humans, learning is more critical for some systems than others – say a fast-food worker versus an airline pilot. Some learning is “good enough,” while other learning needs to be precise.
That ‘good enough’ principle applies to artificial intelligence as well. That’s where things get risky for enterprises. An AI-enabled application may need to do some “learning” before it is ready to assume heavy lifting without human intervention. Agrawal and his co-authors point to autonomous cars as a case in point – there is life-and-death risk in putting these systems out on the road, but this is the only way the systems can learn and improve responses to various situations.
On a corporate level, it’s equivalent to trusting financial systems to AI, even though applications are rudimentary and still require a learning process – input from data of real-world situations. “Machines learn faster with more data, and more data is generated when machines are deployed in the wild,” Agrawal and his co-authors observe. “However, bad things can happen in the wild and harm the company brand. Putting products in the wild earlier accelerates learning but risks harming the brand (and perhaps the customer!); putting products in the wild later slows learning but allows for more time to improve the product in-house and protect the brand (and, again, perhaps the customer). As more companies seek to take advantage of machine learning, this is a trade-off more and more will have to make.”
Tolerance for error is a key consideration. An email filtering application doesn’t need exacting standards, but autonomous driving does. An application with a low tolerance for error may need a considerable amount of data directed at it as part of a continuous improvement feedback loop.
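One simple way to operationalize error tolerance is confidence gating: act automatically only when the model's estimated error fits within the application's tolerance, and escalate everything else to a human. This is a generic sketch of that pattern, not a reference to any specific product; the tolerance figures are illustrative.

```python
def decide(prediction, confidence, error_tolerance):
    """Route a model prediction based on the application's error tolerance.

    error_tolerance is the acceptable error rate for this application,
    e.g. 0.05 for an email filter versus something tiny for a
    safety-critical system. Predictions whose estimated error
    (1 - confidence) exceeds the tolerance go to human review instead
    of being acted on automatically.
    """
    if (1 - confidence) <= error_tolerance:
        return ("auto", prediction)
    return ("human_review", prediction)
```

The same 97%-confident prediction is good enough to auto-apply in a spam filter but gets escalated in a low-tolerance application; the human-reviewed cases then feed back into the training data, which is the continuous improvement loop described above.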
Organizations also need to consider how important it is to capture user data in the wild. There are countless inputs that may shape the decisions an AI application makes, so enterprises need to sort through what is relevant and impactful and what is just a mountain of data.
Source: Informatica Perspectives
Source: Informatica Press Releases