Wednesday, January 16, 2008

A Data Protection Strategy for Today's World

By Bob Baird

Today's enterprises face new levels of risk to their IT operations. Business services can be disrupted by anything from ordinary operator error to natural disasters and physical corruption. At the same time, evolving legal demands are driving enterprises to address increasingly complex and stringent operational conditions, even as they are tasked with protecting more and more data.

Data protection is paramount to prepare for and recover from a data emergency, which is typically corruption and damage resulting from those operational mishaps or disastrous events. Technologies such as backup, monitoring, and replication contribute to recovery, but they are only part of the solution. A complete data protection solution includes best practices, services, and technology. Data protection is the underlying foundation for disaster recovery and high availability.

Today's Data Protection Challenge

Data protection today faces a different set of challenges than the solutions of the past. At one time, online production operations ran during first shift; batch update and reporting operations ran during second shift; and backup and maintenance operations ran during third shift. However, as more applications share the same body of data for different purposes and from locations scattered across time zones, off-hours is becoming a vestige of the past. Yesterday's data protection approaches can only partially solve today's problems and will not solve tomorrow's problems.

Today's backup window is quickly disappearing, a symptom of a business environment that demands greater data availability. New approaches must be adopted that do not contend with production workloads for resources and are not bound by narrow backup windows.

Furthermore, the volume of data requiring protection continues to grow while recovery demands become more aggressive. Growth in the amount of data requiring protection is driven by two factors: first, the volume of data continues to grow 30 to 60 percent per year for most large enterprises; and second, new regulations require companies to retain more data for longer periods of time.
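To see what 30 to 60 percent annual growth implies, the following short Python sketch compounds a protected-data volume forward. The starting volume and the three-year horizon are assumptions chosen for illustration, not figures from the article.

```python
# Hypothetical illustration: projecting protected-data volume under the
# 30-60 percent annual growth rates cited above. The starting volume and
# time horizon are assumptions for the example.

def project_growth(start_tb: float, annual_rate: float, years: int) -> float:
    """Compound the data volume forward by a fixed annual growth rate."""
    return start_tb * (1 + annual_rate) ** years

start = 100.0  # TB of protected data today (assumed)
for rate in (0.30, 0.60):
    volume = project_growth(start, rate, years=3)
    print(f"{rate:.0%} growth: {start:.0f} TB -> {volume:.1f} TB after 3 years")
```

Even at the low end of the range, the volume more than doubles in three years, which is why yesterday's backup windows stop fitting.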

The 'Do Nothing' Option

A comprehensive data protection solution requires planning, and could require staffing and budgetary resources. However, in order to evolve and meet demands, IT cannot ignore these new requirements.

It is true that not all disruptive events have severe, immediate consequences, but frequent small losses and excess IT costs will drain company profits over time. IT administrators can recognize the need for an updated data protection solution by looking for a few symptoms that might signal data protection problems, including:

· Excessive data recovery incidents caused by operational mishaps
· Excessive time elapsed before recovery can begin
· Data recovery problems compounded by operational mistakes
· Resolution of data protection problems that drags on for weeks or months
· Root-cause analysis that takes too long or is not done at all
· Testing of data recovery procedures that is known, or perceived, to be a potential disaster
· Patch levels that are far out of date, or data protection products more than two release levels behind
· Backup jobs frequently delayed until the next available backup window
· Inadequate or nonexistent recovery for database servers, file servers, and mail servers
· A data protection infrastructure that is itself difficult to recover in a disaster

Anatomy of a Comprehensive Data Protection Solution

Data protection has both technical and non-technical aspects and should include people, process, and technology products and services. Although technical aspects are of major importance, people and processes are of equal importance. The technical aspects of data protection refer to solution design, implementation, and operational tasks. People and process refer to planning, best practices, and ongoing testing. A comprehensive data protection solution combines technology and services into a cost-effective solution with the following benefits:

· Reduces the amount of application downtime caused by data emergencies
· Meets recovery objectives that support even the most critical data
· Cost-effectively backs up and retains massive amounts of non-critical data
· Raises backup and recovery success rates well above industry standards
· Mitigates constraints imposed by tight or disappearing backup windows
· Proactively prepares for operational mishaps and disastrous events
· Minimizes the gap between the current environment and a state-of-the-art environment
· Mitigates the attrition/loss of skilled data protection professionals
· Makes the overall cost of disaster recovery and replication more affordable
· Minimizes and manages IT operational risks

The technology goal is to design, implement, and maintain a state-of-the-art data protection environment. There are a number of considerations for administrators who are creating a strategy for comprehensive data protection, including the following:

· Recovery window: Does the solution protect data in a manner that enables recovery within a time window specified by the recovery time objective (RTO) and with no more data loss than specified by the recovery point objective (RPO)? The solution might employ various combinations of replication and backup to this end.
· Comprehensive data support: Does the solution support recovery of all classes of data (e.g., database, flat files, and email)? Recovery should mean restoring data to its normal operational state and verifying success.
· Cost effectiveness for data classes: Does the solution employ the most cost-effective protection methodology for each class of data without compromising recovery objectives? For example, flat files with very relaxed recovery objectives should be handled differently than mission-critical databases.
· Server recovery: Does the solution recover database servers, file servers, and mail servers to an operational state? Recovery is only deemed successful when data can be safely accessed by applications through a data server.
· Automated recovery: Does the strategy include automated recovery, but still leave critical recovery decisions to IT staff? Automation reduces the incidence of operational mishaps and prevents problems from compounding during stressful recovery situations.
· Planning for conflicts: A good data protection plan will take into consideration possible contention with applications for server, network, and storage resources. The potential that backup jobs might disrupt production workloads forces backup operations to be performed during off-hour backup windows. Avoiding or eliminating backup windows increases backup success rates and provides flexibility in backup schedules.
· Tape drive use: Administrators should use tape media and tape drives efficiently to contain costs. Costs associated with tapes are a major factor in a backup solution.
· Effective retrieval: Administrators should implement a strategy to reduce time to retrieve backup media from offline storage. Delays associated with retrieving backup media can add many hours to end-to-end recovery time.
· Network and replication use: IT should use the replication network wisely to contain network costs. Studies and experience have shown that the network is a major cost factor in employing a replication solution.
· Staffing considerations: A good data protection plan will require no more than a skeletal onsite staff for data protection operations. Eliminating the need for a large staff at production locations saves onsite staffing costs, leverages offsite services for multiple production locations, mitigates the impact of a local area disaster, and increases outsourcing opportunities.
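Several of the considerations above, in particular the recovery window and planning for contention, come down to simple arithmetic: can the data be moved within the available window? The sketch below checks that; the volumes, throughput, and window length are assumptions for illustration.

```python
# Hypothetical sketch: checking whether a full backup fits its window.
# The data volume, effective throughput, and window length below are
# assumed figures, not values from the article.

def fits_window(data_gb: float, throughput_gb_per_hr: float, window_hr: float) -> bool:
    """True if the backup can complete within the backup window."""
    return data_gb / throughput_gb_per_hr <= window_hr

# 4 TB of data, 300 GB/hr effective throughput, 8-hour window
print(fits_window(4000, 300, 8))   # 13.3 hours needed -> does not fit
print(fits_window(4000, 600, 8))   # 6.7 hours needed  -> fits
```

When the first check fails, the options are exactly those the article describes: raise throughput, shrink the data per job, or adopt approaches that do not depend on a window at all.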

Conclusion

There is no single technical approach that both delivers aggressive recovery and meets low-cost objectives. Any comprehensive solution must be a combination of a number of strategies, each addressing part of the total data protection problem. When budgeting for a data protection strategy, administrators should note that the cost of the solution includes the people and the technology, as well as the costs imposed by the technology on the underlying IT infrastructure, such as servers, storage, and networks.

IT organizations should consider the future of data protection and how they will adapt their current strategy to meet today's data protection demands and tomorrow's requirements. Ignoring data protection problems can have severe consequences for any organization. A comprehensive approach combining people, process, and technology will help any organization prepare for anything from a small incident to a major disaster.

Bob Baird is senior solutions architect with Symantec's Global Services organization. He has 40 years of experience as an architect and consultant with IBM, HP, and Symantec.
www.symantec.com

The Optimal Backup Solution

It's now within your reach

By Jim McKinstry

A very common problem in the data center is backing up data in a timely manner. IT managers struggle with the constantly shrinking backup window while their data continues to grow at astronomical rates. In today's world, there is no need to suffer through clunky, multi-hour backup windows; SAN-enabled backups can dramatically reduce or completely eliminate the amount of time needed to back up data.

Traditionally, each server that required backup had a tape drive or small library attached. This solution allowed for fast backups but was expensive to deploy and complex to manage. Over time, companies deployed large backup servers with large libraries attached and performed backups over the LAN. These solutions are easily able to grow as more data is brought online. Unfortunately, with growth they become more expensive and more complex to manage.

Enter the SAN
Servers attached to a SAN can quickly (and cost-effectively) take advantage of the SAN to back up data. Attaching backup servers to the SAN and enabling "IP-over-Fibre" for them allows backup traffic that used to travel over the LAN to travel over the SAN using IP. Changes to the backup server will be needed but are minor. For companies that are still using 10/100 networks for backups, the performance increase will be dramatic. Companies using Gig-E for backups may see a performance increase, but will see less traffic flowing over the production network during backups.

SAN and Tape
Adding tape libraries to a SAN allows users to address backup issues in a variety of ways. For many companies, it is difficult to efficiently make use of their tape libraries. Some libraries are constantly busy while others are idle. Adding the backup servers and tape libraries to a SAN gives administrators the flexibility to configure their backups to use the resources efficiently. Many backup servers can utilize large libraries attached to a SAN instead of dedicating a large (expensive) backup server directly to a large library. Also, many smaller libraries can easily be consolidated into a large, SAN-attached library.

For many backup environments, the bottleneck is the network. Once a library is attached to a SAN, every server attached to the SAN has access to it. Most backup software will allow a server attached to the SAN to back itself up directly to the SAN-attached library. These servers will back up at tape speeds since they no longer send their data over the network. Since servers attached to the SAN usually have a large amount of data, their backups usually occupy a large amount of the backup servers' time. By having the application servers back themselves up, not only will their backups finish more quickly, but resources on the backup servers will be freed to back up the rest of the environment faster.

Disk-to-Disk-to-Tape
The next logical step in leveraging the SAN to solve backup problems is to consider backing up production data to SAN-attached disk. This allows backups to finish very quickly, usually faster than backing up to tape, and, more importantly, makes data recovery very fast. The biggest drawback of disk-based backup is that there is no tape that can be stored offsite; this is solved by copying the data to tape after the backups are complete. Besides the raw performance increases it provides, disk is also faster than tape because the disk-based system does not have to physically find a tape in the library, load it, and advance the tape to the data. With disk-based backup systems, "tapes" are loaded instantly and access to data anywhere on the "tape" is instantaneous. There are many ways to implement a disk-based backup solution.

Virtual Libraries
Vendors are starting to introduce libraries that use inexpensive ATA disks with a virtualization engine front end, which emulates a tape library to the backup software running on the backup server. This type of disk-based library typically emulates a library with many tape drives and tape slots. There is nothing special that needs to be done to the backup software; it simply sees another multi-drive/multi-slot library that it can use. Attaching one of these virtual libraries to a SAN allows them to be used in the same manner as the SAN-attached tape libraries described above.

Some backup software vendors offer an option for backing up directly to disk. In addition to the benefits of the hardware-based systems, the software is more cost-effective and has the flexibility to use any available type of disk storage. While the hardware implementations limit users to the amount of backup data that can be stored on a unit, the software version lets them use as much disk as they'd like, enabling them to store huge amounts of backup data nearline. Using SAN-attached disk allows for virtually unlimited capacity that can be allocated for backups.

Snapshots
The ultimate in backup technology is the snapshot. A snapshot is, after all, just a copy of data, which is exactly what a backup is. Snapshots are considered a disk-based backup solution because they are stored on disk. During a backup, the applications on the servers being backed up are either in degraded mode (e.g., a database is put into hot backup mode) or are shut down (called a cold backup). With a library (tape or disk), the applications are degraded or down for a large amount of time, frequently for several hours. With snapshot technology, applications are impacted only for seconds or minutes. Usually, it takes longer for the application to shut down or a database to enter hot-backup mode than it takes to perform the snapshot. Snapshots are powerful because a system can be backed up in seconds, then archived to tape at any time. With a SAN-attached tape library, backup performance would increase by allowing SAN-attached application servers to back themselves up directly to the library. With a snapshot, it can be mounted to a SAN-attached backup server and archived to tape with virtually no overhead or impact to the application server.

There are a variety of ways to perform snapshots:

Split-mirror snapshots create an instantaneous second copy of data that can be used at a later time to recover data; this is accomplished by mirroring disk drives and then breaking the mirror. It is traditionally done within the disk subsystem or from the host using some sort of volume management software, such as VERITAS Volume Manager. Today, SAN appliances like FalconStor's IPStor also perform this function. The major issue with a split-mirror snapshot is that there must be enough storage to accommodate not only the original copy but the split mirrors as well. Assuming that a snapshot is taken every hour and held for 24 hours, a database with one terabyte of disk space allocated to it would need 25 terabytes of usable disk space (production copy plus 24 snapshots). By retaining multiple split-mirror snapshots, the amount of time to recover will be much less than going to a backup system to recover the data.
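The storage arithmetic above is worth making explicit, since it is the main objection to split mirrors. A short sketch:

```python
# Sketch of the storage arithmetic above: split-mirror snapshots need a
# full copy of the data for each retained snapshot, plus the production copy.

def split_mirror_storage(data_tb: float, snapshots_retained: int) -> float:
    """Usable capacity needed: production copy plus one full copy per snapshot."""
    return data_tb * (1 + snapshots_retained)

# One-terabyte database, hourly snapshots held for 24 hours
print(split_mirror_storage(1.0, 24))  # 25.0 TB, matching the example above
```

The linear growth in required capacity is exactly why pointer-based snapshots, described below, are attractive when snapshots are taken frequently.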

Snap-copy snapshots, like split-mirror snapshots, provide a complete copy of the data. The major difference is that when a split mirror is initiated, it creates an instantaneous copy of the data, so the data is available immediately. When a snap-copy is initiated, the data is copied to another area of storage, which may take from a few minutes to hours. As with a split mirror, each snap-copy snapshot requires enough storage to hold an exact copy of the original data and can be created on the host (Volume Manager), within a SAN appliance (IPStor) or on the disk subsystem using, for example, Engenio's SANtricity Storage Manager software.

Pointer-based snapshots are not exact copies of the data but a set of pointers that point to the original data. As the original data is written to, the changed blocks are written to the snapshot reserve area and the pointer is moved to that block; this process is called "copy on write." Subsequent writes to the original data are not copied to the snapshot reserve area because the original block has already been copied. One of the most attractive aspects of a pointer-based snapshot is that the snapshot reserve area needs just a fraction of the original disk space, since only the changed blocks are copied. Because pointer-based snapshots require such a small amount of space, they can be taken more frequently at a low cost. Pointer-based snapshots are the most robust of the snapshot technologies and can be created on the host (Volume Manager), within a SAN appliance (IPStor), or on the disk subsystem (SANtricity).
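The copy-on-write mechanism can be sketched in a few lines of Python. This is a toy model, not how any real volume manager is implemented; the class and method names are illustrative.

```python
# Minimal copy-on-write snapshot sketch (illustrative only, not a real
# volume manager): the snapshot starts as pointers to the original blocks,
# and a block is copied to the reserve area only on its first overwrite.

class Volume:
    def __init__(self, blocks):
        self.blocks = list(blocks)
        self.reserve = {}          # snapshot reserve area: index -> old block
        self.snapped = False

    def snapshot(self):
        self.reserve = {}
        self.snapped = True        # pointers only; no data copied yet

    def write(self, index, data):
        # Copy on write: preserve the original block the first time only
        if self.snapped and index not in self.reserve:
            self.reserve[index] = self.blocks[index]
        self.blocks[index] = data

    def read_snapshot(self, index):
        # Changed blocks come from the reserve; unchanged ones from the volume
        return self.reserve.get(index, self.blocks[index])

vol = Volume(["a", "b", "c"])
vol.snapshot()
vol.write(1, "B1")
vol.write(1, "B2")                 # second write: not copied again
print(vol.read_snapshot(1))        # "b" - the snapshot view is unchanged
print(len(vol.reserve))            # 1  - only changed blocks consume space
```

Note how the reserve area holds only one block even after two writes: this is why the reserve needs just a fraction of the original disk space.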

Optimal Solution
The optimal solution, which solves a majority of most companies' backup issues, is to implement a SAN with Fibre Channel disk and snapshot technology for production, ATA disk for disk-to-disk backup, and a tape library at a remote location attached via a stretched SAN. A backup would consist of a snapshot of the production data, which would be mounted to a backup server. The backup server would then back up the snapshot to the ATA disk. That backup would then be archived to tape. The copy of the backup that resides on the ATA disk would "age" using the same retention policies that were in place for the local tape copy. The backup that was archived to tape can remain at the remote site and fulfill most companies' requirements for offsite tape storage.

Conclusion
Historically, solving backup needs was an easy task: calculating how many drives were needed was just a function of the backup window, the amount of data to be backed up, and the speed of the backup drives. For some, this is still the preferred method. For others, the backup window may be so small, or their environment so complex, that they will need to implement some sort of SAN-enabled technology. If the need is for fast backups and a lot of random restores, then a SAN disk-based backup solution may be in order. Many companies may implement a variety of solutions (e.g., adding a disk-based backup unit or a snapshot solution that extends the useful life of an existing tape-based library). Whatever the need (backup, restore, or archive), there is a SAN-enabled solution on the market to address it.
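The traditional drive-count sizing mentioned above is a one-line calculation. The figures below are assumed for illustration:

```python
# Sketch of the traditional sizing calculation: number of tape drives as a
# function of the window, the data volume, and per-drive speed. The numbers
# below are assumptions for the example.
import math

def drives_needed(data_gb: float, drive_gb_per_hr: float, window_hr: float) -> int:
    """Smallest number of drives that can move the data within the window."""
    return math.ceil(data_gb / (drive_gb_per_hr * window_hr))

# 6 TB of data, 100 GB/hr per drive, 8-hour window
print(drives_needed(6000, 100, 8))  # 8 drives
```

When this number grows past what the budget or library slots allow, that is the signal to consider the SAN-enabled alternatives described in this article.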

Jim McKinstry is senior systems engineer at Engenio Information Technologies, Inc.
(Milpitas, CA)
www.engenio.com

Continuous Data Protection Makes Backup Better, Faster, Cheaper
By Eran Farajun

The data backup world has changed dramatically in recent years. No change has been more dramatic or rapid than the shift from traditional tape-based backup technology to disk-to-disk (D2D) backup. Disk-based backup has enabled shorter backup windows and more rapid data recovery, which has opened the way for more sophisticated backup and recovery software technologies that were not possible with tape backup systems. Software vendors have responded to the technology potential of disk-based backup with new enhanced functionality, such as point-in-time snapshots and local and remote replication, in an effort to reduce the vulnerability to data loss between scheduled backup sessions.

Beyond the pure speed advantages, disk backup is also the right technology at the right time to address the convergence of two business trends: the necessity for 24/7 data access in a global wired economy and the increasing use and importance of remote offices. According to the Enterprise Strategy Group, an estimated 60 to 70 percent of mission-critical data is stored and used at offsite locations. Enterprise IT managers face the challenge of protecting and managing all remote data in an era of tight budget constraints, compounded by the reality that geographically distributed locations typically lack the IT staff to manage, monitor, and verify backup operations.

Continuous Data Protection (CDP) Gains Momentum

CDP is the disk-based backup and recovery strategy gaining traction in data centers of various sizes. The traction is especially visible among users of Exchange, where the management and compliance challenges are driving elements of the CDP marketplace.

A CDP product is one that continuously monitors an object for changes and preserves copies of all prior versions of the object. The user can view and access these prior versions as required. The time to perform recovery shifts from hours or days to seconds or minutes. The backup window is no longer a problem because the concept of a backup window no longer applies.

CDP is a cross between disk-based backup and replication. CDP continually captures all changes made to a file and tags (versions) objects so that they can be rolled back to a particular point in time. The business value of CDP lies in the ability to restore data objects to a point before a data corruption or interruption event takes place. CDP protects and captures data as it is written to disk. One of the great myths of CDP is the unspoken assertion that CDP is for every kind of data, all the time. This is, of course, untrue, since the value of data changes with time, urgency, and business dynamics.
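To make the versioning-and-rollback idea concrete, here is a minimal, hypothetical sketch in Python. The class and method names are illustrative only and do not correspond to any real CDP product's interface.

```python
# Illustrative CDP sketch: every write to an object is preserved as a
# tagged version, so the object can be rolled back to any point in time
# before a corruption event. (Names and structure are assumptions.)

class VersionStore:
    def __init__(self):
        self.versions = {}  # object name -> list of (timestamp, content)

    def write(self, name, content, timestamp):
        # Continuous capture: every change is kept, nothing is overwritten
        self.versions.setdefault(name, []).append((timestamp, content))

    def rollback(self, name, point_in_time):
        # Latest version at or before the requested point in time
        candidates = [(t, c) for t, c in self.versions[name] if t <= point_in_time]
        return max(candidates)[1] if candidates else None

store = VersionStore()
store.write("report.doc", "draft", timestamp=100)
store.write("report.doc", "final", timestamp=200)
store.write("report.doc", "corrupted!", timestamp=300)
print(store.rollback("report.doc", point_in_time=250))  # "final"
```

The rollback call illustrates the business value described above: the object is restored to its state just before the corruption event, not merely to the last scheduled backup.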

One important scenario to keep in mind when considering the implementation of CDP is that of centralized backup for the remote or branch office. Too often, basic IT tasks like monitoring the backup server and changing tapes can be missed when assigned to remote office clerical staff not skilled in IT. Using a CDP strategy over the WAN to protect branch office file servers removes the requirement for tape drive and media handling at the remote site.

What about Recovery?

There are two general principles that govern all recovery policy-making: the recovery point objective (RPO) and the recovery time objective (RTO). The RPO defines how much data you are willing to lose when you recover data.

The RTO defines how long it will take to recover your business processes from a data failure. This includes not only the data recovery, but restarting the servers or applications that depend on that data. These recovery considerations must also be applied to local and remote recovery strategies.
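As a rough illustration (the figures below are assumptions, not from the article), the recovery point achieved by scheduled backups can be compared against an RPO target in a few lines:

```python
# Sketch: measuring achieved recovery point against an RPO target. With
# scheduled backups, worst-case data loss is the time since the last good
# backup; a true CDP product drives this toward zero. Figures are assumed.

def data_loss_minutes(last_backup_min: float, failure_min: float) -> float:
    """Worst-case data loss: everything written since the last backup."""
    return failure_min - last_backup_min

rpo_target = 15.0                       # minutes of acceptable loss (assumed)
loss = data_loss_minutes(last_backup_min=0.0, failure_min=240.0)
print(loss, loss <= rpo_target)         # four-hourly backups miss a 15-minute RPO
```

The same comparison, run against each application's own RPO, shows which data actually needs continuous protection and which can stay on a schedule.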

A true CDP product protects every data change as it takes place, so the RPO approaches zero. On the other hand, with such a vast amount of data being recoverable, how you choose the recovery point affects your RTO.

Some recovery points are based on time, such as a particular hour or minute. More useful, however, are event-based recovery points. Since every data change is protected, a loss event can be absorbed and matched to a corresponding recovery point.

Implementing CDP

CDP solutions are designed to be block-based, file-based or application-based. Block- and file-based CDP solutions have the advantage of functioning with a range of different applications, while application-based CDP is optimized and tightly integrated with a specific application, such as Microsoft Exchange. Potential CDP buyers should also be aware of the level of recovery granularity a particular CDP solution provides, as all CDP products are not created equal on this issue. Some products only support recovery of servers, volumes or folders and lack the granularity to recover a single file or email message.

CDP is deployed most frequently as an appliance or as a software solution running on a server or switch with agents. A dedicated CDP appliance can deliver good performance without impacting application servers, but the hardware cost can be extremely high, especially when an enterprise needs to scale its CDP capabilities and add more appliances. The software solutions are billed using various licensing strategies, frequently per server, and involve agents that must reside on each server to be protected. The more servers a user has, the more agents that have to be purchased and managed: a stumbling block for the SMB, and a potential struggle for the enterprise that might have to manage agents on hundreds (or thousands) of servers. There is a significantly better way.

Host-based CDP software eliminates the hardware expense of a CDP appliance but comes with its own set of cost and complexity issues. The software solutions require that agents be installed on each server to be protected, creating management overhead and additional costs. The pricing model for this type of CDP is typical of most enterprise backup software that charges a license fee for each server or database that is protected, regardless of how often the CDP functionality is actually used by each server.

The third CDP architectural alternative is to incorporate the CDP functionality as a feature of a full-featured backup and recovery software suite, which has proven to be the simplest, most cost-effective, and most practical approach to CDP.

CDP as a Feature

CDP as a feature should be designed as remote office CDP with the capability to work over a WAN. The CDP functionality should be integrated as a feature with no additional cost to customers and no separate CDP application or appliance to purchase.

Apart from offering the CDP functionality in the software, CDP as a feature should include a robust feature set with retention policy management and the ability to perform data restores without interrupting CDP backups.

CDP as a feature should include a two-stage continuous backup. Backup starts with a change event, and granularity extends down to the pace at which data is written to disk in a consistent state. Local servers aggregate the changes, then deduplicate, compress, and encrypt them.
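The aggregation stage can be sketched with standard building blocks: deduplicate changed blocks by content hash, then compress the unique ones. This is a hedged illustration of the idea, not any vendor's implementation; the encryption step would follow with a real cipher and is omitted here.

```python
# Illustrative sketch of the local aggregation stage described above:
# changed blocks are deduplicated by SHA-256 digest and compressed before
# transmission. (Encryption is omitted; a real product would encrypt the
# compressed payload with a proper cipher.)
import hashlib
import zlib

def prepare_changes(blocks):
    """Deduplicate blocks by content hash, then compress the unique ones."""
    seen = set()
    payload = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in seen:          # skip blocks already captured
            seen.add(digest)
            payload.append((digest, zlib.compress(block)))
    return payload

changes = [b"block-A" * 100, b"block-B" * 100, b"block-A" * 100]  # one duplicate
payload = prepare_changes(changes)
print(len(payload))                      # 2 unique blocks survive dedup
```

Deduplication plus compression is what makes continuous protection over a WAN to a branch office affordable: most of the raw change stream never leaves the site.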

An agentless architecture is significant. The disadvantage of an agent-based architecture is that you have to manage agents installed on perhaps hundreds or even thousands of client machines. Agentless solutions do not require another application running in the background, greedily consuming IT resources such as memory and CPU cycles, or exposing another surface to attack.

Selecting a CDP Product at a Glance:

To determine whether CDP as a feature is right for your remote or branch office location, ask yourself a set of qualifying questions: Are you worried about meeting remote-site service level agreements (SLAs) established by the CIO? Do you need to measure the business impact of downtime at remote sites? Do you have rapidly changing data that is critical to business operations? Are you worried about shrinking backup windows to protect that data? If you answered yes to one or more of these questions, you should seriously investigate CDP technology for your remote sites.

There are several features a robust CDP product brings to the market. These include:

* Support of heterogeneous storage and server environments. Today's customers are refusing to be locked into a single vendor for their storage and server solution. Users should select a CDP product that doesn't restrict them to only a subset of their possible storage and server environments.
* Awareness of applications and their environments. Application recovery is becoming more complex and time-consuming; users should choose a product that integrates application specifics into the CDP recovery process.
* Non-invasive to the application or server that is being protected. A CDP product should attempt to minimize any impact to the application’s I/O throughput or CPU load. This is best done by keeping the CDP footprint on the application server to a minimum, and moving any 'heavy-lifting' to an external server or appliance.
* Built on a scalable, reliable platform. If the CDP product is hosted on an appliance platform, the user should have the ability to add additional appliances that can scale their CDP capacity as data protection needs grow.
* Supports a federated application environment. Many of today's complex applications (such as SAP R3) utilize servers and storage that span multiple hosts. Customers should choose a CDP product that supports these systems, as it provides the user with a consistent, federated image for recovery.
* Supports business policies and SLAs. Companies assign different values to their different applications. A CDP product that is flexible in its support of differing protection and recovery policies can provide a better overall solution.
* Can be extended. Look for a CDP product that has functionality that can be easily extended by the customer to meet business needs.
* Tightly integrated with business continuity technologies. A CDP product that supports application clusters and remote replication provides a stronger solution than a CDP product that only provides a stand-alone solution.

Conclusion

Continuous Data Protection is the latest piece in an enterprise’s data protection arsenal. The CDP model has been redefined and the cost of CDP deployment has been driven to zero by integrating CDP as a feature in a distributed backup software platform. CDP is not a replacement for other data protection technology. Instead, it complements existing backup, replication and snapshot technology to bring advanced backup and recovery capabilities to improve the protection of customer data.

Eran Farajun is executive vice president at Asigra.
www.asigra.com

How To Do Data Recovery, Data Security, Data Backup The Right Way

By Jon Arnold

One of the most frightening things that can happen to a person is losing the data on their hard drive. Many of us store personal and business information on our computers. The thought of losing data due to a crashed or failed hard drive, or perhaps a breach of data security, sends chills down the spine of every grown man and woman. Once you accept that you did not back up your data, you need to begin the data recovery process.

What exactly are your data recovery options once your hard drive crashes? First of all, stop using the computer immediately. Do not run any data recovery software that came with your computer, as this software can overwrite the original files. The next step is to allow a hard disk recovery company to restore your pertinent files; contact a reputable company immediately. A few companies will come right to your home, but at-home data recovery services charge more. Certain computer files, such as DWG files (a complex graphics file format), require a specialist to recover. If there is physical damage to the hard drive, you may not be able to recover your files. A crashed hard drive is an instance where data recovery is possible; if your data security is breached, you may never recover what is stolen.

The best way to avoid compromised data security is to put safeguards in place. A few data security measures are encryption, antivirus, and firewall protection. Encryption translates data into a secret code; to read an encrypted file, you must have access to the key or password that enables you to decrypt it. There are many software programs and services that provide data encryption, depending on your data security needs. Additional measures such as antivirus and firewall protection provide further data protection, and some software programs provide both antivirus and firewall protection for overall data security.

One way to avoid the mess of data recovery is to perform regular backups. Basic computer maintenance includes full and incremental backups. It is recommended that you perform a full backup once a week. An incremental backup copies only the files modified since the last backup, and depending on your needs you can schedule it to run every day. It is also beneficial to keep a copy of your data offsite. When you need to access this data, you can either open the offsite data program and run it, or log on via the web. Check with your service provider on how to perform data recovery if needed.
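To make the full-versus-incremental distinction concrete, here is a minimal sketch (the function name and layout are hypothetical) of an incremental pass that copies only files changed since a given timestamp:

```python
import shutil
from pathlib import Path

def incremental_backup(source: Path, dest: Path, last_backup_time: float) -> list[Path]:
    """Copy files modified since the last backup; return the files copied."""
    copied = []
    for f in source.rglob("*"):
        if f.is_file() and f.stat().st_mtime > last_backup_time:
            target = dest / f.relative_to(source)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(f)
    return copied
```

A weekly full backup is the same loop without the timestamp check: every file is copied.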

Data recovery either from a crashed hard drive or lost through compromised data security can be a frustrating and devastating experience. The best situation is to avoid the loss of data all together. With regular computer backups and data security measures in place your computer data will remain where it needs to be, on your hard drive and easily accessible.

Some people balk at the cost of protecting their data, as well as the time involved in doing secure backups. What they fail to consider is the cost of the personal and productivity time spent trying to recover data that could have been restored in literally MINUTES had it been properly safeguarded in the first place. If your computer data files are the lifeblood of your business or personal life, the time and money involved in protecting that data adequately cannot be measured in dollars.

Jon is a computer engineer who maintains many websites to pass along his knowledge and findings. You can read more about data recovery, data backup and data security at his web site at http://all-about-data.com/

Article Source: http://EzineArticles.com/?expert=Jon_Arnold http://EzineArticles.com/?How-To-Do-Data-Recovery,-Data-Security,-Data-Backup-The-Right-Way&id=355860

Data Protection Strategies: Are You Using these Top 5 Strategies to Protect Your Data & Business?

Tim Rhodes

We lock our businesses, our homes and our cars to restrict wrongful entry and burglary. We invest heavily in security systems to deter and prevent loss. But how can we similarly protect intellectual property?

A recent Trends in Proprietary Information Loss survey report, sponsored by PricewaterhouseCoopers, the U.S. Chamber of Commerce and the ASIS Foundation, found that both Fortune 1,000 and small to mid-sized businesses were likely to have experienced proprietary information and intellectual property losses totaling between $53 billion and $59 billion. These losses involved:

- R&D (49%)

- Customer lists and related data (36%)

- Financial data (27%)

How would your organization react, and what would it lose, given this type of exposure?

No company is 100 percent safe, whether storing information electronically or in paper file cabinets.

Data loss occurs every day through various channels, including current and former employees, competitors and on-site contractors. But just as devastating, if not worse, are the uncontrollable effects of civil unrest and natural disasters like earthquakes, floods and fires. While there's no way to stop the sharing of information in business, just as it's not feasible or realistic to lock the physical doors and windows of your business establishment 24/7, there are measures you can take to minimize your risk.

Here are my top 5 ways to protect your data:

1. Assess Your Inventory and Risk

Conduct a comprehensive inventory of your business information. Catalog electronic data and identify type and purpose. Once cataloged, rate the risk of each based on its importance to the organization's ongoing operations.

For example:

- Is the information essential to mission-critical business functions (such as payroll, banking or legal documentation)?

- Is the information required for business continuity but not detrimental should systems falter for a brief period of time (such as email), or deferrable should systems falter for an extended period of time (such as past employee or past customer archives)?

- Is the data sensitive or unrestricted?

Once a comprehensive inventory is in place along with risk-ratings for each category, management can quickly record and assess risk at file creation.
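One lightweight way to record such an inventory (the tiers, scores and assets below are invented examples, not a prescribed scheme) is a simple catalog structure that computes a risk score from the continuity tier and sensitivity:

```python
from dataclasses import dataclass

# Hypothetical tiers matching the questions above; adjust to your organization.
RISK_TIERS = {"mission-critical": 3, "continuity": 2, "deferrable": 1}

@dataclass
class DataAsset:
    name: str
    tier: str        # one of RISK_TIERS
    sensitive: bool  # sensitive vs. unrestricted

    @property
    def risk_score(self) -> int:
        # Sensitive data is bumped up one level.
        return RISK_TIERS[self.tier] + (1 if self.sensitive else 0)

inventory = [
    DataAsset("payroll records", "mission-critical", True),
    DataAsset("staff email", "continuity", False),
    DataAsset("past customer archive", "deferrable", True),
]

# Highest-risk assets first, so protection effort goes where it matters most.
ranked = sorted(inventory, key=lambda a: a.risk_score, reverse=True)
```

With ratings attached at creation time, every new file inherits a known place in the protection plan.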

2. Implement New Policies

Implement new policies defining procedures for security breach, system failure or threat. This should include remediation and reporting strategies. All businesses should also have a confidentiality policy signed by all employees. This policy should outline employee responsibility, information use and disclosure practices.

3. Access Controls and Authorization

When dealing with sensitive information, have processes in place restricting physical and/or electronic access. This might include keyed or coded entry for paper or password restriction for electronic files or folders, firewalls and program encryption. Organizations should also require employees to use shredders when destroying confidential documents.

4. Ongoing Communication

Sharing information is a natural instinct among social groups and communities. This means, you should continually communicate with your employees, sub-contractors and consultants. You want to ensure all parties understand what information is confidential and what their responsibilities are in safeguarding its integrity.

5. Maintain a Clear Accountability Trail

With employees aware of their responsibilities, businesses should hold them accountable for confidentiality leaks and breaches caused by their actions. This means consistent disciplinary action for all individuals violating company policy.

Just as physical security is critical to protecting assets and inventory, businesses must make information security a high priority. Information security should include inventory, valuation, access controls, consistent communication and clear accountability trails. When organizations implement a comprehensive program using each of these five strategies, they're well on their way toward maximum data protection.

Article Source :
http://www.bestmanagementarticles.com
http://data-management.bestmanagementarticles.com

About the Author :
Data Protection Expert, Tim Rhodes has helped hundreds of companies just like yours protect their most valuable asset online. Now, you can discover if you're doing everything you can to prevent information loss with Tim's Free Risk Assessment Quiz. Take the FREE QUIZ now at: http://www.webargos.com/quiz and see if your company is at risk!

Saturday, January 12, 2008

How To Create An Image Using Symantec Ghost, Ghostcast Server?

By Dave Kierkels

If you want to create an image of your hard disk, it can be done in several ways. One way is to boot the computer from a Symantec Ghost floppy disk and create the image by multicasting, unicasting, disk-to-disk or peer-to-peer.

So if you want to create an image the first thing you have to do is to create a Symantec Ghost boot disk. This boot disk doesn't boot to Windows but boots the Symantec Ghost utility in DOS.

How to create an image using GhostCast server?

GhostCast server is a utility of the Symantec Ghost software. Using GhostCast server you can create images of computers in the network or you can restore an image to several connected clients at the same time. The only thing you have to do is to boot all the clients with a Symantec Ghost boot disk, then you accept all the connected clients in the GhostCast server utility and then you simply send the image to all connected clients.

First, make sure that the computer you want to create an image of is connected to the network. Of course, the GhostCast server computer has to be connected as well.

Boot the client from the Symantec Ghost boot disk.

Now return to your GhostCast server computer and start the GhostCast server utility by clicking Start -> Programs -> Symantec Ghost -> GhostCast server.

Here you have to enter a session name in the Session Name box.

Then you enable the 'Create Image' option to create an image from the clients.

Then you click the 'Browse' button to browse for a place where the image must be stored.

If you want to create an image of the entire hard disk of the client, select the 'Disk' option. If you want to create an image of a particular partition, select the 'Partition' option. (In this example I describe how to create an image of a disk.)

Then you click the 'Accept Clients' button.

Now GhostCast server waits for the clients to connect.

You can see that no clients are connected yet.

Now go to the client, which is booted into Symantec Ghost, and select (by mouse or keyboard) 'GhostCast' -> 'Unicast'.

You enter the session name exactly the same as given in GhostCast Server.
Leave the option 'Discovery Method' to 'Automatic' and click 'Ok'.

Next, select the drive of which an image must be made, and click 'Ok'.

Here you can choose whether the image should be compressed to save disk space. The options are 'No', 'Fast' or 'High' compression. I recommend choosing 'Fast' compression.

The next thing you need to do is to confirm the image creation by clicking 'Yes'.

Now Symantec Ghost is waiting for you to click the 'Send' button in GhostCast Server.

So, return to the GhostCast Server computer and click the 'Send' button.
After you click 'Send', the image will be created and stored at the location you chose in GhostCast Server.

You can see the details in GhostCast Server.

After the GhostCast progress is 100 % your image is created and you can close GhostCast Server.

You have successfully created your Ghost image using GhostCast Server!
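Ghost performs its own integrity checks, but if you later copy the finished image file to another disk or server, a generic way to confirm the copy is intact (a sketch, not part of Ghost itself; the filename is made up) is to compare checksums:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large images never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

If `sha256_of(original)` equals `sha256_of(copy)`, the two files are byte-for-byte identical.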

Dave is the webmaster of http://www.about-your-computer.com which is made to help people with their computer.

This website is a perfect guide to help you understand how your computer works, a description of the hardware components and how you can setup and configure software using Windows XP and Windows Vista.

Here you can find also many free to download tools.
Furthermore, you can advertise your website very cheaply - only $5 a year! http://www.about-your-computer.com

Article Source: http://EzineArticles.com/?expert=Dave_Kierkels http://EzineArticles.com/?How-To-Create-An-Image-Using-Symantec-Ghost,-Ghostcast-Server?&id=862888

Protecting Children Online With Internet Parental Controls

by: Kelly Hunter


The World Wide Web is a fascinating place. It has obliterated geography in terms of education and business. It facilitates learning by allowing kids to see things and experience aspects of different places they may never get the chance to see in the non-virtual world. The Internet can bring people together who otherwise would never know each other and create a virtual universe that is totally cohesive, with every kind of information imaginable literally available at your fingertips. Sounds great, doesn’t it?

Unfortunately, the Internet has a dark side. It is full of material that is inappropriate for children, and all kinds of predators. Leaving your kids alone to fend for themselves on the Web is exactly as dangerous as leaving them in a crowded airport or shopping mall. You don’t know where they’re going or who they’re with. The news is filled with horror stories about kids who have been taken advantage of on the Internet, but you don’t want yours to miss out on all the positive aspects of the technology. The first line of defense in keeping your kids safe on the Web is to teach them how to use it safely.

A lot of online dangers can be dodged simply by reminding kids of one of their earliest learned lessons: don’t talk to strangers. The kinds of people who want to harm kids have all kinds of tricks up their sleeves. They may try to lull your child into a false sense of security by pretending to be someone she knows. Make sure your child understands that it isn’t a good idea to give out personal information such as their address, phone number or the name of their school. The less information a potential predator has, the harder it will be for him to actually locate a victim. It might be a good idea to establish a secret password and share it only with friends and family so your kid has a way to identify people who are safe to chat with.

Chat interfaces and instant messaging are great tools for keeping in touch with friends and conducting business, but they are also direct connections between your child and possible pedophiles and other predators. Most instant messengers have settings that will only allow people on a pre-approved list to approach your child. That way you can let the kids chat with family and friends while keeping the bad guys out. You can visit http://www.internet-parental-control.org to find more information on online child safety measures.

You can’t watch your kids every minute they are online, and you can’t always count on them to do what you have taught them to do. Parental control software is a great back up. Most browsers will allow you to customize age-appropriate settings for each child in your house. You can choose what kinds of Web sites you want your kids to access and block them out of the ones you don’t. It’s a great way to provide a virtual safety net for your family. If the parental controls supplied by your Internet Service Provider aren’t enough, check into installing additional software that will evaluate each site your child attempts to access. You set criteria by which the software judges each Web page and assigns a rating, much like a movie rating. Your kids will only be able to look at sites with ratings you have deemed appropriate.

How to Set Internet Safety Rules for Children

By Ambrose Duperon (http://ezinearticles.com/?expert=Ambrose_Duperon)

The Internet is an amazing tool, available for use by millions upon millions of people every day. Unfortunately, just as many people abuse the resources provided by the Internet. Sexual predators abuse the Internet by using it to stalk vulnerable people, collect personal information, and plan their attacks.

Before giving your child Internet access, have a conversation about both the benefits and dangers of the Internet. Lay down a set of rules to govern your child's use of the Internet and be sure to include the following:



Explain the importance of keeping personal information private. Explain what personal information is. Younger children may not understand that addresses, phone numbers, school names, and parent names should be kept private.

Keep the computer in a public place. Children should not need privacy while using the Internet to network with friends or while they are doing homework. You should be able to monitor your child's Internet use at all times.

Make sure your child knows that he can, and should, let you know if anyone he meets on the Internet makes him feel uncomfortable. The same applies to information or websites that your child may accidentally access; knowing how your child gained access to dangerous information will help you to prevent a similar occurrence in the future.

Prohibit the exchange of personal photographs, especially with strangers. A picture, combined with any other personal information that may have been obtained, will increase a sexual predator's chances of locating and harming your child.

Make sure you have access to your child's user ID's and passwords, not only for e-mail accounts but also for any website that requires the input of personal information for private access.

Prohibit your child from meeting online friends in person. If there is a reason for you to allow a personal meeting, it should be in a public place and in your presence. Your child should not gain the impression that it is ok to meet other people without your permission.

While teaching your child about the dangers of others on the Internet, be sure to explain that your child should not abuse the system or hurt others. Incidents of children bullying classmates on the Internet have increased as well, and your child should contribute to keeping the Internet a safe place for his friends as well as himself.

Once the rules are set, make sure they are strictly enforced. Not backing down will let your child know you are serious about his safety. It may not seem like it at the time, but your child will thank you later in life.

Ambrose Duperon http://www.onlinepredators.info

If your children use the Internet you must read our free report - "Online Predators" (http://www.onlinepredators.info/OnlinePredatorsFAQ.pdf)

Article Source: http://EzineArticles.com/?expert=Ambrose_Duperon http://EzineArticles.com/?How-to-Set-Internet-Safety-Rules-for-Children&id=846789

Friday, January 11, 2008

What is Online Data Backup?

What is Online Data Backup? by Lee Morrell

Online backup, also known as offsite backup or remote backup, is a data protection service that enables your company to restore data regardless of disaster. Your business will have important data that you cannot afford to lose stored on PCs, laptops or servers. This can be file data, such as Word or Excel documents, or application data, such as mail, financial or customer management data. If your company cannot survive without this data, then you need an online data backup service. It will automatically carry out the complicated function of backing up your data to a remote storage environment while offering a simple method of management.

The concept of offsite or online backup is simple: you have data residing on your server which you need to protect. The easiest and most secure way of doing this is to ensure your backup is moved to a location as far away as possible from your offices. This will ensure that, regardless of disaster (fire, flood, structural damage), you will never lose any of your important data.

The backup process is automatically carried out when your business is closed, i.e. every night, but this can be controlled by you; you can back up every hour if required. All companies must make an initial full backup, and thereafter make nightly incremental backups. An incremental backup will only back up the changes made during the day's trading. This is a great way of cutting down your backup windows, and it also gives you multiple versions of your backup.
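Keeping multiple versions only helps if old sets are rotated out in an orderly way. A minimal sketch (the directory layout and date-based naming are assumptions for illustration) that keeps only the newest N backup sets:

```python
import shutil
from pathlib import Path

def prune_backups(backup_root: Path, keep: int = 7) -> list[Path]:
    """Delete all but the newest `keep` backup set directories.

    Assumes each nightly set lives in a directory named by date
    (e.g. 2008-01-11), so lexical order equals chronological order.
    """
    sets = sorted(p for p in backup_root.iterdir() if p.is_dir())
    removed = sets[:-keep] if len(sets) > keep else []
    for p in removed:
        shutil.rmtree(p)
    return removed
```

With `keep=7` and nightly incrementals, you always retain a week of restore points.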

The concept of online backup is simple and the offerings from most companies are pretty much the same, so why do the monthly charges vary so much between suppliers? The easy answer: it's all about software functionality and hardware infrastructure. For example, I could set up a company right now that offers remote backup solutions. I could buy a software license for a half-decent backup product, build a cheap server and run it from my home. Not the best way to treat your company's most important asset.

Backup Software – should be feature-rich, offering volume shadow copy, in-file delta and one-click restore processes as standard. Agents to back up all major applications, such as Microsoft Exchange, Microsoft SQL, Lotus, Oracle and MySQL, should be included as standard, as should support for all major operating systems. Finally, the most important element of any online backup solution is security: your data must be encrypted to levels used by the military prior to being transmitted via the internet.

Hardware Infrastructure – should be totally resilient to failure. For example, the environment you are sending your data to must consist of many servers and many storage environments, so that if one part of the system fails, other parts automatically pick up the workload. This infrastructure should also be replicated to a data storage facility, preferably in a different country, all ensuring that loss of your data is an impossibility.

In summary, companies who charge £10 per month will probably have a server in their home; companies who charge £50 per month will probably have £500,000 of hardware located throughout Europe.

For more information on how online backup can help your business, please visit www.perfectbackup.co.uk

Article Source: http://www.article-buzz.com

Data Protection Guide

Protecting Your Valuable Data


Set up your computer in a safe environment. Your computer should be in a dry, cool, controlled environment that is clean and dust-free. Placing your computer in a low-traffic area will protect your system and storage media from harmful jarring or bumping.

Back up your data regularly. Creating regular backups is one of the most effective ways to protect yourself from losing data. Back up data at least once a week with reliable tapes or other storage devices, always verifying that the correct data is backed up.
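Verification can be as simple as comparing each source file against its copy. A rough sketch for disk-based backups (tape backups need the tape software's own verify option; the function name here is made up):

```python
import filecmp
from pathlib import Path

def verify_backup(source: Path, backup: Path) -> list[Path]:
    """Return source files that are missing from, or differ in, the backup."""
    problems = []
    for f in source.rglob("*"):
        if f.is_file():
            copy = backup / f.relative_to(source)
            # shallow=False forces a byte-by-byte comparison of contents.
            if not copy.is_file() or not filecmp.cmp(f, copy, shallow=False):
                problems.append(f.relative_to(source))
    return problems
```

An empty list means every source file has an identical copy in the backup.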

Use an uninterruptible power supply (UPS). In the event of a surge of electricity or lightning strike, an uninterruptible power supply protects your computer from being fried. In addition, a UPS has a battery backup that keeps your computer running for a short time in the event of a power outage, giving you time to save your work and avoid potential data loss. If UPS is not an available or economical solution, a surge protector is also a good investment.

Run a virus scan regularly and update it at least four times a year. Computer viruses are among the worst enemies of your computer. Good anti-virus software tests your system for sequences of code unique to each known computer virus and eliminates the infecting invader.
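The "sequences of code" test described above is signature scanning. A naive sketch of the idea (the signatures here are made up; real scanners use huge, constantly updated databases and far faster matching algorithms):

```python
# Hypothetical signatures: byte patterns that identify known malware.
SIGNATURES = {
    "demo trojan": b"FAKE-MALWARE-MARKER",
    "demo worm": b"\xde\xad\xbe\xef\x13\x37",
}

def scan_file(path) -> list[str]:
    """Return the names of any known signatures found in the file."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for name, sig in SIGNATURES.items() if sig in data]
```

This also shows why updates matter: a scanner can only catch the signatures it knows about.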

Be aware of strange noises. If you hear a strange noise or grinding sound, turn off your computer immediately and call an expert. Further operation may damage your hard drive beyond repair.

If you do experience a data loss, Ontrack Data Recovery can help. Even the best maintenance program cannot always prevent system crashes or data loss. Ontrack Data Recovery offers a wide array of data recovery solutions, ranging from in-lab data recovery services and remote data recovery services to cost-effective do-it-yourself data recovery software. Please visit our Data Recovery Service Center to learn more about these solutions, and to find out which is best for your particular situation.

Caring for Your Hard Disk Drive

Despite the obvious importance of this equipment to your system, many users neglect to care for their hard disk drive. Your drive is easily susceptible to many sources of damage. Ontrack offers the following tips to protect and care for your hard drive:

Protect your drive from excessive jarring and bumping. All too often, when people install, move or reconfigure hard disk drives, they knock the drive around unintentionally, damaging equipment that can result in the loss of data.

Beware of static. Static electricity, an unseen and unfelt enemy, can wreak havoc on the wiring inside computer chips and transistors. Because it's so easy to discharge built-up static when you touch a hard disk drive, precautions like wrist straps can help prevent static discharge.

Acclimatize the room in which you store your equipment. Be careful of temperature, humidity, altitude and vibration, all forces that could lead to the intermittent or total failure of hard drives.

Perform periodic checks of your hard disk drive. Ontrack Data Advisor™ software can run tests of your system, warning you of impending problems.

Place your hardware in a safe location. When you move your computer to a new position (from your desk to the floor, or from a horizontal to a vertical position), you should always back up the hard disk drive. An accidental bump to the drive could cause the heads to track differently, resulting in disk read or write errors.

Caring for Your Tapes

Tapes are delicate storage media, but like the hard drive, people often fail to give their tapes adequate attention and protection.

Keep your tapes boxed until you need to use them. Opening tape boxes prematurely will unnecessarily increase a new tape's exposure to dust, moisture and sunlight, and could eventually erode a tape's quality and dependability.

Do not attempt to load a tape into the drive if you notice dents, cracks or moisture in the tape's cases, hinged doors or file-protect selectors. Loading a damaged tape could not only lead to further tape damage, but compromise the integrity of your system.

Store your tapes in proper fashion. It's important to store your tapes in their original cases, and standing upright. This helps prevent uneven winding of the tape and protects them from potentially damaging environmental elements.

Beware of temperature extremes. Store your tapes at room temperature. Excessive heat can cause the plastic used in tapes to warp, causing instant destruction and unpredictable read/write errors in your tapes.

Avoid magnetic fields. Speakers, microwave ovens and printer heads can destroy your tapes and erase all information stored on them.

Tips for Successful Data Recovery

No matter how hard you try to protect your data, your system may still fall victim to data loss. Regardless of the cause of your data loss, there are steps you can take to keep your data loss from becoming a data disaster.

Don't panic. You should never assume your lost data is unrecoverable. Simply call a qualified data recovery expert or visit the Ontrack Data Recovery Service Center. In most cases, your data is fully recoverable.

Do not use file recovery software if you suspect an electrical or mechanical failure. Using file recovery software on a faulty hard drive may destroy what was otherwise recoverable data.

"Undelete" tools can save your data from human error. Most disk utility packages contain a function that allows you to retrieve an erased file. This tool must be used immediately; however, because your computer will quickly write new data over the deleted file. For users of Windows 95 and Windows NT (v4.0) deleted files are stored in the recycle bin and can be restored as long as you haven't emptied the recycle bin. For more information see your Windows documentation or help file.

Do not clean or operate equipment damaged by a natural disaster. Whether it's a flash flood or a twisting tornado, if you've fallen victim to a natural disaster, there is only one option. Call Ontrack immediately. Your chances for a successful recovery are greatly reduced if you make any attempts yourself to clean or dry your damaged computer.

If your data falls victim to computer crime, seek professional advice. Computer crime is a growing threat to data. Intentional data deletion, duplication and theft increasingly threaten valuable computer information. If you need to uncover criminal computer activity or need electronic evidence to make your case, contact Kroll Ontrack immediately. Ontrack Electronic Information Management specializes in computer theft diagnosis and electronic evidence gathering.

Create and maintain reliable backups. It's surprising how few people actually back up their systems regularly and verify that backups are complete and error-free.

If you have suffered data loss, seek professional help. Professional data recovery services offer the expertise and tools required to recover your data quickly and efficiently.

Ontrack Data Recovery specialises in data recovery services and software in the United Kingdom.

Article Source: http://EzineArticles.com/?expert=Etienne_Clergue http://EzineArticles.com/?Data-Protection-Guide&id=897733

Top Ten Tools to Boost your Internet Security

by Morgan Stevens

Morgan Stevens is a Lbry.com contributor

When you think of security, you likely think of banks and financial institutions, or at least companies in fear of corporate espionage. However, they are not the only ones that have to be concerned about security. Anyone, including you, who uses a home or business computer to get on the Internet should be concerned about security. The Internet, though a wonderful tool full of a wealth of information, is also full of hackers, viruses and scams that can cause your computer a great deal of damage. Luckily, though, there are ways to prevent a lot of what is out there from getting to your computer. There are tools and precautions that you can take. The following is a list of ten that can go a long way in protecting you and increasing your internet security.

First of all, make sure that you have a firewall up and running on your computer, as well as a virus checker. In addition, make sure that any firewall included in the software you are using has been activated and is set properly. Consult a professional or your software's manual to make sure the settings are on and correctly placed for optimum security.

Secondly, try not to share your computer. Obviously there are times when, especially with a home computer, that there will be multiple users. The problem is more with kids than anything. Children have a tendency to download anything they think they might need without thinking it through. If you do have to share your computer with kids, take time to talk to them about the risks of downloading software.

Third, back up your data often, especially what you consider to be essential data. If you don't do anything else, make sure you back up your files. Anything you back up can then be recovered if there is a problem. Viruses, worms and the like can eat up files, and before you know it family photos, business files or important contact information can be lost.

Fourth, make sure you know what you are doing before you download anything. Don't click on any unknown link that asks you to agree to install software to view their page. Sometimes those contain spyware.

A fifth way to increase security is to be careful about any business or sensitive information you access or pass on a public computer. You don't know what has been downloaded onto those computers that could affect your files. Public computers could even have spyware, key logger software, or a number of other programs that could steal sensitive information. If you must use a public computer, make sure you sweep it with some sort of spyware seeking program.

Sixth, carry a flash drive with you. You can load your own software onto the flash drive for use on any public computer. This will help keep you safe from viruses. You may also want to use it if you are going to be using a family computer that you are not positive is virus- and spyware-free.

Seventh, be wary of visiting porn sites or anything like that. Those sites often contain spyware, Trojan horses, viruses and all sorts of nasty bugs. These sites can even have software that runs in the background of your computer without asking you, so it is a good idea to just stay away.

Eighth, never open SPAM email while you are still connected to the Internet. Many times, just by clicking on SPAM, you may be adding your own email address to another SPAM list. Often your email address will be used as the sending address, and you will not know this until your account is closed. You can also end up with viruses or worms from clicking on SPAM emails.

Ninth, turn off your Internet connection if you are determined to see what is in the SPAM email. If you really feel like you need to look at a message, but are unsure where it is from, shut off your connection at the firewall first.

Lastly, make sure you have a good virus checker installed on your computer. There are many free ones out there and they work well at protecting your computer from viruses. Look for a virus checker with regular updates, even daily updating, and make sure you receive those updates. Virus software will help protect your computer from new viruses and new worms that get discovered on the Internet. By making sure you are updating regularly, you can keep up with the viruses that are out there.