In personal document management, a common problem for users is handling file and folder duplication. Duplicates can be created deliberately (e.g. creating different versions of a document to preserve a history) or inadvertently (e.g. copying a file to a USB drive and then back to a different location). Users must spend time and effort to consciously and manually manage this duplication, or they run the risk of losing or overwriting data. This study of 73 knowledge workers combines a snapshot of their file system with a questionnaire about their document management practices in order to understand their document management structures, strategies and struggles. We find that current personal document management systems (i.e. the file systems built into modern operating systems) do not provide adequate support for managing file duplication. We explore the systems that users have developed to work around this deficiency and suggest some guidelines for the design of more effective document management systems.

Documents and files are not synonymous. A document is a single conceptual entity, with an integrated form and purpose and a life history from creation, through editing, to eventual deletion or dormancy. A file is a logically connected chunk of data that exists on a storage medium. Files may contain documents, but they are also used to store data that is not a document, such as application files and configuration files. It is rare for a single file to contain more than one document, but reasonably common for a document to reside in more than one file. There are four different ways this can happen. A large document (such as a book or long report) might be split into one file per chapter. A document might be saved in both Microsoft Word and PDF file formats; while there is still only a single conceptual document, it resides in two files, and this is always deliberate. Duplicate files: with duplicates, two or more files contain the exact same document content. A common personal document management activity is copying files from one device to another – from a work computer to a USB drive to a home computer, for instance. These activities can result in multiple copies of a document existing in different places.

---

I just wanted to confirm something on this thread before I finalize my exclusions document. Example taken from the following Microsoft KB:

- Drive:\Program Files\SharePoint Portal Server
- Drive:\Program Files\Common Files\Microsoft Shared\Web Storage System

Written in McAfee "exception language", it could look like this, correct?

- **\Program Files\SharePoint Portal Server\
- **\Program Files\Common Files\Microsoft Shared\Web Storage System\
- C:\%Program Files%\Common Files\Microsoft Shared\Web Storage System\
- **\*\Common Files\Microsoft Shared\Web Storage System\
- **\%PROGRAMFILES(X86)%\SharePoint Portal Server\ – if running a 64-bit system (just an example, didn't verify actual file locations)
- **\*\SharePoint Portal Server\ – not a good idea since it seems rather general, but still valid, correct?

What's the best practice for excluding locations such as %APPDATA%? Sometimes it's located here, but the user name always changes: C:\Documents and Settings\\Application Data.

- C:\Documents and Settings\%username%\Application Data\
- C:\Documents and Settings\*\Application Data\
- **\Documents and Settings\*\Application Data\

I would prefer to use variables as much as possible, since that seems to cover much broader situations.

What we need is a 'Zen and the Art of VirusScan Exclusions'. Getting the right exclusions unfortunately is not as simple as just bunging in a number of documented defaults; it requires skill and an understanding of your environment, especially of your acceptable risk levels, and these will differ from installation to installation, or even vary within an organisation depending on the exposure of the systems.

I find that excluding by process instead of by folder (or drive letter, as in the case with MSCS) is more efficient for my purposes. This way I can specify the two or three processes responsible for the file I/O (e.g. sqlservr.exe, store.exe, etc.) and not have to worry about where the application is writing its files.

For me the biggest issue with exclusion policy management is that policies aren't cumulative: you can't, for instance, exclude a set of processes at a global level and then add a couple of additional process exclusions for a specific set of servers. Instead you have to duplicate the global policy and add the extra processes, and in doing so you lose the connection to the global exclusions, so you now have to remember to edit all local policies whenever the global policy changes.

Finally, I also find that software vendors invariably specify exclusions too generously. If we excluded everything that Microsoft, Symantec, BMC, Altiris etc. have documented, there wouldn't be many folders left being scanned.
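Exclusion paths like `**\*\SharePoint Portal Server\` mix literal folders with wildcards. As a rough illustration of how such a pattern can be checked against concrete paths – assuming simplified semantics where `*` matches one path segment, `**` matches any number of segments, and `?` matches a single character (the vendor's actual matching rules may differ) – here is a small Python sketch:

```python
import re

def exclusion_to_regex(pattern: str) -> re.Pattern:
    """Translate a McAfee-style exclusion path into a regex.

    Assumed, simplified semantics (not the vendor's specification):
      **  matches any number of path segments, including none
      *   (as a whole segment) matches exactly one segment
      ?   matches a single character within a segment
    Matching is case-insensitive, since Windows paths are.
    """
    regex = []
    for segment in pattern.strip("\\").split("\\"):
        if segment == "**":
            regex.append(r"(?:[^\\]+\\)*")      # zero or more whole segments
        elif segment == "*":
            regex.append(r"[^\\]+\\")           # exactly one whole segment
        else:
            esc = re.escape(segment)
            esc = esc.replace(r"\*", r"[^\\]*").replace(r"\?", r"[^\\]")
            regex.append(esc + r"\\")
    return re.compile("".join(regex), re.IGNORECASE)

# The broad wildcard form covers any user's profile folder:
pat = exclusion_to_regex(r"**\Documents and Settings\*\Application Data")
print(bool(pat.match(r"C:\Documents and Settings\alice\Application Data\x.dat")))  # True
print(bool(pat.match(r"C:\Program Files\Application Data\z.tmp")))                 # False
```

Under these assumed semantics, the `*` segment absorbs the changing user name, which is exactly why the wildcard- and variable-based forms cover broader situations than a hard-coded path.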
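The cumulative behaviour wished for above – local policies that extend a global exclusion set rather than replace a duplicated copy of it – can be sketched in a few lines. This is a hypothetical model, not an existing product feature; `w3wp.exe` is just an illustrative local addition:

```python
# Hypothetical cumulative policy model: a server's effective exclusions are the
# global set plus local additions, so editing the global set propagates
# everywhere without touching each duplicated local policy.
GLOBAL_PROCESS_EXCLUSIONS = {"sqlservr.exe", "store.exe"}

def effective_exclusions(global_set: set, local_additions: set) -> frozenset:
    """Resolve one policy as global + local instead of a detached copy."""
    return frozenset(global_set) | frozenset(local_additions)

# e.g. a web front-end adds the IIS worker process on top of the globals:
print(sorted(effective_exclusions(GLOBAL_PROCESS_EXCLUSIONS, {"w3wp.exe"})))
# → ['sqlservr.exe', 'store.exe', 'w3wp.exe']
```

Because the local policy stores only its additions, a later change to the global set is picked up automatically, avoiding the edit-every-local-policy problem described above.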