Mainline provides data storage and protection architecture design services to define a client's current and future storage architecture. This includes documenting the existing architecture, facilitating workshops to define assumptions and rules to guide the new architecture, and producing logical and physical diagrams of the architecture at 12, 24, and 36 months. The goal is to align storage infrastructure like SANs, arrays, and backups with strategic needs and growth requirements.
The document discusses Mainline's Storage Investment Analysis service which calculates the total cost of ownership (TCO) and loaded cost per terabyte of a company's storage infrastructure. The service delivers a presentation of the results, a TCO and ROI calculator, and optionally provides a cost breakdown per storage service class. It helps clients improve infrastructure utilization and identify projects that do not achieve a return on investment. Mainline can also incorporate the analysis into a larger storage assessment to further enhance results.
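As a rough illustration of the arithmetic behind a loaded cost per terabyte, the sketch below divides total annual storage spend by usable capacity. The cost categories and figures are hypothetical placeholders, not Mainline's actual TCO model.

```python
# Minimal sketch of a loaded-cost-per-terabyte calculation.
# All cost categories and amounts below are hypothetical; a real
# analysis would use the client's actual hardware, software,
# facilities, and staffing costs.

def loaded_cost_per_tb(annual_costs: dict[str, float], usable_tb: float) -> float:
    """Total annual storage cost divided by usable capacity."""
    return sum(annual_costs.values()) / usable_tb

costs = {
    "hardware_depreciation": 400_000,   # arrays, SAN switches
    "software_licenses": 150_000,       # replication, backup tools
    "maintenance": 120_000,
    "facilities": 80_000,               # power, cooling, floor space
    "staff": 250_000,                   # admins, engineers
}

print(f"${loaded_cost_per_tb(costs, usable_tb=2_000):,.2f} per TB per year")
```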
This document discusses archive and compliance services from Mainline Information Systems. It explains that Mainline can help organizations comply with regulations by defining what data needs to be archived, building compliance and archive policies, implementing archive solutions, and migrating applicable data. Mainline's archive and compliance service establishes a company-wide process for assessing regulations' IT impacts, identifying risks, defining owners and architecture, and acquiring archive storage solutions. The service is part of Mainline's larger storage assessment methodology, which can deliver related services together or individually.
Built on Oracle Analytics Cloud and powered by Oracle Autonomous Data Warehouse Cloud, Fusion Analytics Warehouse (FAW) provides Oracle ERP and HCM Cloud Application customers with best-practice key performance indicators (KPIs) and actionable insights driven by advanced analytics.
The document discusses Software-Defined Storage (SDS), which virtualizes storage such that users can access and control it through a software interface independent of the physical storage devices. SDS has advantages over traditional network storage systems like SAN and NAS in that it has lower costs, greater flexibility and agility, better resource utilization, and higher storage capacity. It divides storage functionality into a control plane that manages virtualized resources through policies, and a data plane that processes and stores data.
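To make the control-plane/data-plane split concrete, here is a minimal sketch in which a control plane applies a placement policy to virtualized tiers while a stubbed data plane handles the I/O. The tier names, attributes, and policy are illustrative assumptions, not any particular SDS product's API.

```python
# Sketch of the SDS control-plane/data-plane split described above.
# Tier names, attributes, and the placement policy are hypothetical.

from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    iops: int
    cost_per_gb: float

class ControlPlane:
    """Control plane: evaluates policy against virtualized resources."""
    def __init__(self, tiers: list[Tier]):
        self.tiers = tiers

    def place(self, required_iops: int) -> Tier:
        # Pick the cheapest tier that satisfies the performance policy.
        eligible = [t for t in self.tiers if t.iops >= required_iops]
        return min(eligible, key=lambda t: t.cost_per_gb)

class DataPlane:
    """Data plane: actually stores and retrieves data (stubbed here)."""
    def write(self, tier: Tier, volume: str, data: bytes) -> None:
        print(f"writing {len(data)} bytes to {volume} on {tier.name}")

tiers = [Tier("flash", 100_000, 0.20), Tier("nearline", 5_000, 0.03)]
cp, dp = ControlPlane(tiers), DataPlane()
dp.write(cp.place(required_iops=20_000), "vol-app01", b"payload")
```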
How to create a successful data archiving strategy for your Salesforce Org (DataArchiva)
Data archiving has proven to be one of the most effective approaches to managing Salesforce data growth and storage space. You can seamlessly archive your Salesforce data using Big Objects and save significantly on data storage costs.
Learn more about Cisco BDWE. Appfluent Visibility has been named part of Cisco's Big Data Warehouse Expansion, a solution that helps customers control costs and manage expanding data warehouses.
Seven Essential Strategies for Effective Archiving (EMC)
Archiving helps organizations effectively retain, manage, and leverage their information assets. It is also complementary to backup and broader data protection activities. This paper outlines seven essential strategies for archiving that drive cost savings, risk reduction and IT transformation.
Vibrant Technologies is headquartered in Mumbai, India. We are the best Data Warehousing training provider in Navi Mumbai, offering live projects to students as well as corporate training. According to our students and corporate clients, we provide the best Data Warehousing classes in Mumbai.
Mainline provides an Infrastructure Data Analysis service to identify invalid or unnecessary data stored on clients' storage systems, recommend proper placement of valid data across storage tiers, and define retention policies. The service analyzes metadata using a cloud-based tool to profile data usage and characteristics, with deliverables including storage capacity analysis, opportunities to clean up data, and data placement recommendations. This helps optimize storage usage, reduce costs, and ensure critical business data is stored appropriately.
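As a simplified illustration of metadata-based profiling, the sketch below walks a filesystem and buckets files into tiers by last-access age. The thresholds, tier names, and path are hypothetical; the actual service uses an agentless, cloud-based tool rather than a local script.

```python
# Hypothetical sketch of profiling data usage from filesystem metadata.
# Age thresholds and tier names are illustrative assumptions.

import os
import time

def classify_by_last_access(root: str, days_warm: int = 90, days_cold: int = 365):
    """Bucket files into tiers by last-access time from file metadata."""
    now = time.time()
    tiers = {"hot": [], "warm": [], "cold": []}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            age_days = (now - os.stat(path).st_atime) / 86400
            if age_days < days_warm:
                tiers["hot"].append(path)
            elif age_days < days_cold:
                tiers["warm"].append(path)
            else:
                tiers["cold"].append(path)  # candidate for archive or cleanup
    return tiers

for tier, files in classify_by_last_access("/data").items():
    print(f"{tier}: {len(files)} files")
```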
Mainline provides process engineering services to develop ITIL-compliant operational processes for storage and backup. Their services include developing customizable processes to address top issues, creating auditable and trainable processes, and linking processes within an interconnected model. Their process engineering helps reduce errors, improve root cause analysis, and enable process training for teams. It is part of Mainline's larger storage assessment methodology.
The document provides information about Mainline's Enterprise Storage Assessment service. The assessment evaluates a client's current storage environment, processes, organization, and governance to identify gaps. It then provides deliverables including an overview of the current environment, operational processes, team responsibilities, governance measures, and outage logs. The assessment also defines a target future environment and provides a prioritized roadmap of recommendations to address identified gaps. Mainline's assessment aims to help clients improve their storage services and reduce outages.
The document discusses Mainline's Storage Strategy Workshop service. The workshop helps storage and IT teams develop a strategic plan to guide infrastructure decisions. It involves identifying business challenges, current projects, team skills, objectives, and a vision and mission statement. This ensures the team is proactive in addressing issues rather than reactive. The strategy provides guidance that empowers both management and staff. It is meant to be fluid and change over time in response to evolving needs.
The document discusses Application Landscaping, which is a process for classifying applications into business tiers based on their non-functional requirements. The process involves identifying representative applications, determining key requirements, scoring the applications, and grouping them into service classes (e.g. platinum, gold, silver). This provides a standardized way to deliver storage services and reduces custom requests. The deliverables include documentation of the process, application scoring results, service class definitions, and optional additional analyses.
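A minimal sketch of the scoring step might look like the following, where each application is rated against weighted non-functional criteria and mapped to a service class. The criteria, weights, and cutoffs are invented for illustration.

```python
# Hypothetical application-landscaping scorer: rate each application on
# weighted non-functional requirements, then map the total to a class.

WEIGHTS = {"availability": 0.4, "performance": 0.3, "recovery": 0.3}

def service_class(scores: dict[str, int]) -> str:
    """scores: criterion -> 1..5 rating agreed with the application team."""
    weighted = sum(WEIGHTS[c] * s for c, s in scores.items())
    if weighted >= 4.0:
        return "platinum"
    if weighted >= 3.0:
        return "gold"
    return "silver"

apps = {
    "order-entry": {"availability": 5, "performance": 5, "recovery": 4},
    "reporting":   {"availability": 3, "performance": 3, "recovery": 2},
}
for app, scores in apps.items():
    print(app, "->", service_class(scores))
```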
The document discusses Mainline Information Systems' Service Catalog Design service. The service aims to transform clients' custom service request processes into a finite set of standardized service choices aligned with application needs. It does this by building a service catalog that outlines the storage and backup services offered to internal clients sequenced over three options. This reduces unique requests and improves customer satisfaction by providing a standardized process and set of solutions. The service catalog is further enhanced when combined with Mainline's Application Landscaping and Data Analysis services.
The document describes Mainline's Service Readiness Assessment, which evaluates a company's storage and backup service delivery capabilities. It involves interviews to collect information on processes, organization, technology, and governance. Mainline then scores the results and provides recommendations to improve service delivery compared to industry averages. The assessment takes about 3 hours and provides a same-day initial score and recommendations within a week. It is part of Mainline's larger storage assessment methodology.
Daniel Reznick has over 20 years of experience in technical IT management roles. He has extensive experience managing complex network, system, and storage architectures in mission critical environments. Some of his strengths include data center technologies, leadership, process improvement, project management, and problem solving. He is proficient in technologies such as Cisco, VMware, Citrix, Microsoft, Linux, and storage solutions. Reznick has held roles such as Senior Escalation Engineer at Cisco and Global Technical Support Manager at Whiptail Technologies where he managed technical support teams and infrastructure.
White Paper 2 - Mapping Manager - Bringing Agility to Business Intelligence (AnalytixDataServices)
The document discusses how AnalytiX™ Mapping Manager™ can help organizations build Business Intelligence solutions in an agile way. It does this by managing requirements, generating logical data models, automating source-to-target mappings, versioning changes, and providing visibility into the data integration process. This allows organizations to focus on high-priority requirements, prove solutions early, and rapidly deploy Business Intelligence while managing changes.
aThe GRT DWBI Development Approach is a comprehensive approach t.pdf (anandinternational01)
a:
The GRT DW/BI Development Approach is a comprehensive approach to the design, development, and implementation of data warehouse solutions. It is based on input from several sources and on the direct experience of GRT Consulting, and it has evolved to be responsive to the dynamic nature of this business area. It is the synthesis of a detailed methodology for developing data warehouses and a methodology for doing so incrementally. This generates early value for the business during the implementation process while ensuring the quality of the overall implementation effort. The foundation is the set of Data Warehouse Development Processes, summarized immediately below; an overview of the Incremental Approach and then of the resulting GRT DW/BI Development Approach itself follow that discussion.
b:
The sections below provide perspective on each of the critical processes in a data warehouse development effort. In addition to the details discussed in each section, each process is supported by numerous control mechanisms and management techniques that facilitate the success of the overall project.
Primary responsibilities: data quality, warehouse administration, metadata management, data access, database design and build, testing, training, transition, and post-implementation.
c:
The Business Requirements Definition process defines the requirements, clarifies the scope, and establishes the implementation roadmap for the data warehouse. With the direction of the client organization, strategic business goals and initiatives are established and used to direct the strategies, purpose, and goals for each phase of the data warehouse solution. Definition focuses on determining the specifics of the solution to be developed and delivered, identifying the client's information needs, and modeling the requirements.
The objective of the Data Acquisition process is to identify, extract, transform, and transport the various source data necessary for the operation of the data warehouse. Data Acquisition is performed between several components of the warehouse, including operational and external data sources to data warehouse, data warehouse to data mart, and data mart to individual marts. A Data Acquisition Strategy is also developed to outline the approach for extraction, transformation, and transportation of the source data to the data warehouse. The strategy includes selecting a tool or set of tools as the data pump, or defining the specifications of one that must be built. If tools are to be utilized, high-level tool requirements, tool evaluations, and tool recommendations are also addressed.
Detailed analysis is performed on the data sources, and a mapping is created between the current state of the source data and the new set of objects that define the data warehouse. With the mapping, a gap analysis is produced to validate that the information requirements can be met with the available data. With the detailed analysi.
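A toy version of the source-to-target mapping and gap analysis described above could look like this; all system and field names are hypothetical.

```python
# Sketch of a source-to-target mapping with a gap analysis.
# Systems, fields, and warehouse attributes are invented for illustration.

# Warehouse attribute -> (source system, source field), or None if unknown.
mapping = {
    "customer_id":  ("crm", "cust_no"),
    "order_total":  ("erp", "ord_amt"),
    "region":       ("crm", "region_cd"),
    "credit_score": None,  # no known source yet
}

# Fields actually available in each source system.
available = {
    "crm": {"cust_no", "region_cd"},
    "erp": {"ord_amt"},
}

# Gap analysis: warehouse attributes whose requirements cannot be met.
gaps = [
    attr for attr, src in mapping.items()
    if src is None or src[1] not in available.get(src[0], set())
]
print("unmet information requirements:", gaps)  # -> ['credit_score']
```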
This document provides a summary of William McGrath's experience as an IT professional with over 26 years of experience in storage architecture, implementation, and maintenance. He has expertise in technologies from vendors such as EMC, NetApp, Cisco, HP, Oracle, and more. His most recent role involved architecting and implementing a hybrid cloud infrastructure delivering storage as a service.
Achieve New Heights with Modern Analytics (Sense Corp)
Businesses can leverage modern cloud platforms and practices for net-new solutions and to enhance existing capabilities, resulting in an upgrade in quality, increased speed-to-market, global deployment capability at scale, and improved cost transparency.
In this webinar, Josh Rachner, data practice lead at Sense Corp, will help prepare you for your analytics transformation and explore how to make the most of new platforms by:
Building a strong understanding of the rise, value, and direction of cloud analytics
Exploring the difference between modern and legacy systems, the Big Three technologies, and different implementation scenarios
Sharing the nine things you need to know as you reach for the clouds
You'll leave with our pre-flight checklist to ensure your organization achieves new heights.
This document discusses Oracle's value proposition for its big data solutions. Key points include:
- Oracle offers engineered systems that integrate hardware and software to securely manage both new and existing data types and formats for big data.
- The solutions allow customers to acquire, organize, analyze and make decisions from big data to develop predictive analytics and gain competitive advantages.
- Oracle partners with other companies through its Oracle Partner Network to increase sales and empower partners with Oracle resources and specializations.
- Oracle solutions serve many customer segments including telecommunications, energy, life sciences, healthcare, oil and gas, manufacturing, and retail.
UNIVERGE BLUE Backup as a Service Case Study: B & A (InteractiveNEC)
This case study describes how B&A, a government IT solutions provider, implemented a Backup as a Service (BaaS) solution from NEC to simplify their backup processes and improve data security and compliance. Key points:
- B&A previously had complex, manual backup processes across multiple providers that lacked visibility and compliance.
- NEC designed a BaaS solution using their HYDRAstor technology and hosting backups in Iron Mountain's secure data centers.
- This simplified B&A's backups, provided full visibility and verification of backups, and ensured compliance with government standards.
- The new automated solution has reduced costs and freed up B&A's IT staff to focus on other tasks.
SaaS Application Scalability: Best Practices from Architecture to Cloud Infra... (riyak40)
From crafting a versatile and modular architecture, adopting microservices, and integrating robust load balancing, to leveraging auto-scaling, monitoring, and statelessness, every phase of development presents an opportunity to build a more efficient, responsive, and resilient application.
The document discusses planning technical architecture for the ServiceNow platform, including instances, integrations, and data flows. It emphasizes that architecture decisions should be driven first by business objectives and context. The key steps outlined are to: 1) develop a clear understanding of architectural needs, 2) define the instance and data architecture, 3) define the integration architecture, 4) manage the ServiceNow architecture, and 5) plan for expansion.
Software-defined storage (SDS) provides storage software that runs on standard server hardware to deliver data services. The document discusses the top five use cases and benefits of SDS, including reducing storage costs through scalable commodity hardware, improving performance by optimizing storage I/O, better provisioning and automation of storage resources, robust management of heterogeneous storage arrays, and tightly aligning storage with broader infrastructure management. SDS can lower costs while improving performance, efficiency, and flexibility compared to proprietary storage systems. However, SDS also presents challenges around integration, support skills, and interoperability that must be addressed.
Christopher King is an award-winning systems architect and storage engineer with over 9 years of experience integrating business needs with cost-effective technology solutions. He has extensive expertise in data growth and protection, disaster recovery design, and virtualization. King has a proven track record of managing large-scale projects and IT budgets up to $1 million. His experience includes roles at Yahoo!, Georgetown University, American Chemical Society, and The George Washington University.
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you're in, you're in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the fourth year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
DATA STORAGE AND PROTECTION ARCHITECTURE DESIGN
Assumptions are good! They are what guide the design of your storage architecture.
Mainline Information Systems | Storage and Data Services Practice
Making uneducated, unexplored and unconfirmed assumptions in life is not advised, but building and documenting your experience-based IT assumptions is critical to validating a defined architecture.
At Mainline, we document your existing storage architecture and then facilitate the building of your planned architecture by defining a framework and a series of assumptions to guide the crafting of that architecture. Once validated, these assumptions become the detailed, low-level rules, or architectural decisions and guidance, used to align your SAN, storage arrays, backup/restore and archive infrastructure with future growth requirements in conjunction with your strategic storage mission. The results of this architectural definition are part of a larger best-practices effort, typically defined as part of the storage strategy. Architectural definition is a process, and as a process it requires a definition that is documented, published, maintained and enforced.
It is not just about painting a pretty picture of your architecture
While not discounting the value of visualizing an architecture, the deeper business value of an architectural definition lies in operational efficiency and in the healthy questions involved in building a process around who owns and administers that process. Is the owner the IT director or manager? Who is the administrator? Is it someone on the team? A new hire? Someone on the existing or future engineering team? Once the owner and administrator are established, what do they do to maintain and enforce that architecture standard?
If we were just painting a picture, in six months that picture would become invalid, and even if we documented the assumptions, the same would be true for them. The overall architecture definition, and the detailed implementation of it, must be maintained as part of regular business. If service classes are defined, each class has an architecture to be defined. If data placement, retention and residency were defined per class, they get included as part of that architecture definition. As the SAN switch infrastructure grows, those new devices get added to both the physical and logical diagrams. Without better tools, when a new LUN is allocated, that LUN is manually documented on a detailed physical array diagram. Keeping the definition up to date is critical for problem and change management. When making procurement decisions, issuing RFPs, and making tactical and strategic decisions to expand that solution, your storage architecture is the on-the-ground extension of your storage strategy.
What is included as part of this offering?
Mainline brings storage technical and business expertise and conducts a storage architecture workshop meant to define the data storage, protection and archive infrastructure, including up to two sites, the SAN, inter-site telecommunication links, fibre channel, iSCSI and IP arrays (NAS), archive solutions (CAS), tape library solutions, and any storage cloud solutions and services. Architectural decisions will be defined for the above, with assumptions about structured and unstructured growth and about what the future-state logical architecture should look like. Detailed physical diagram requirements will be identified, whether rack-level or data-center-level views, to define a future SAN switch-cabling diagram or how the equipment would be placed in the racks. Assumptions about the rules for building racks, placement on the floor, cabling rules, use of patch panels, etc. all get defined as makes sense for your infrastructure. You will validate this documentation, and a final presentation will be given back to your team defining the existing architecture, the future architecture and the related architectural decisions used to determine that architecture, including an estimated view of the logical diagram at 12, 24 and 36 months.
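For a sense of how the 12-, 24- and 36-month views might be modeled, the sketch below applies compound annual growth to a baseline capacity. The baseline figures and growth rates are placeholder assumptions, not part of the offering.

```python
# Sketch of growth modeling behind 12/24/36-month logical diagrams.
# Baseline capacities and growth rates are hypothetical placeholders.

def projected_tb(current_tb: float, annual_growth: float, months: int) -> float:
    """Compound-growth projection of capacity."""
    return current_tb * (1 + annual_growth) ** (months / 12)

baseline = {"structured": 300.0, "unstructured": 900.0}  # TB today
growth = {"structured": 0.15, "unstructured": 0.35}      # assumed annual rates

for months in (12, 24, 36):
    total = sum(projected_tb(baseline[k], growth[k], months) for k in baseline)
    print(f"{months:2d} months: ~{total:,.0f} TB")
```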
Data Storage and Protection Architecture Design is part of Mainline's larger storage assessment methodology
Mainline's Storage Assessment methodology consists of ten service areas that can be delivered as a whole to exploit the inherent synergy, or as stand-alone services, depending on where you are in the storage transformation journey.
Data Storage and Protection Architecture Design provides additional value when delivered with the following services within the methodology:
Enterprise Storage Assessment with Data Storage and Protection Architecture Design takes the current and target environment described in the Enterprise Storage Assessment and uses it as the starting point to converge on the future architecture within the design session.
Infrastructure Application Landscaping with Data Storage and Protection Architecture Design provides detailed application-level requirements for storage infrastructure, providing a framework for the architectural elements required to address each service class.
Infrastructure Data Analysis will provide specific storage tier and backup retention policy recommendations that will drive growth estimates by tier over time.
EXPERTISE YOU CAN TRUST
Eighty-five storage experts skilled in storage solutions from every major vendor
Decades of industry expertise in designing, implementing and optimizing storage solutions for environments of all sizes
Services covering product implementations, complex data migrations, information lifecycle management, storage assessments, and advanced archiving and protection strategies
Residencies and managed storage services to improve storage operations and reduce operating cost
Next Steps:
Contact your Account Executive, or reach us at StorageServices@mainline.com.
For more information on our storage services, go to http://mainline.com/storage-transformation.
STORAGE STRATEGY WORKSHOP
A facilitated and managed approach to defining infrastructure strategy.
INFRASTRUCTURE APPLICATION LANDSCAPING
Facilitate infrastructure and application teams working together to build a scorecard, score application requirements and agree on the future onboarding process.
ENTERPRISE STORAGE ASSESSMENT
Define the current and target data storage and protection environment, identify the gaps, and provide an actionable roadmap of recommendations. The core of any storage assessment.
DATA STORAGE AND PROTECTION ARCHITECTURE DESIGN
Define and illustrate both your current and future environment, including logical and physical diagrams modeling growth over 12, 24 and 36 months by data center.
STORAGE INVESTMENT ANALYSIS
Define and enable a loaded unit cost of storage supporting showbacks/chargebacks through the development of TCO, ROI and CBA.
SERVICE READINESS ASSESSMENT
Use Mainline's scoring tool to establish a service delivery baseline and compare it against your industry average. Use it to measure improvement.
SERVICE CATALOG DESIGN
Organize your infrastructure services into a catalog of your choosing based on strategy, data requirements and agreements with application teams by service class, or build a starting point to update as you learn more.
INFRASTRUCTURE DATA ANALYSIS
Using an agentless, cloud-based Storage Resource Management tool, define valid and invalid data, recommend residency and retention policies, and classify application data/servers into service tiers.
PROCESS ENGINEERING
Document and detail ITIL-based process definitions within an IT-based process reference model, starting with storage and backup processes and building an operational guide delivering auditable, repeatable, one-way-to-implement infrastructure.
INFRASTRUCTURE TRANSITION PLANNING
For any large transformation or migration project, define a project plan to be delivered by your team, a third party, Mainline or some combination.