Access – The ability or the means necessary to read, write, modify or communicate data/information or otherwise use any system resource.
Access Control – The process that limits and controls access to resources of a computer system; a logical or physical control designed to protect against unauthorized entry or use.
Access Control Mechanisms – Hardware, software, or firmware features and operating and management procedures, in various combinations, designed to permit authorized access to a computer system and to detect and prevent unauthorized access.
Access Rights – Also called “permissions” or “privileges”, these are the rights granted to users. Access rights determine the actions users have been authorized to perform (e.g., read, write, execute, create and delete).
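As an illustration of how access rights translate into an authorization check, the following minimal Python sketch (with hypothetical roles and permission names, not university systems) tests whether a requested action is among the rights granted to a user's role:

# Illustrative sketch only: a minimal permission check with hypothetical roles.
PERMISSIONS = {
    "registrar_staff": {"read", "write"},
    "auditor": {"read"},
}

def is_authorized(role: str, action: str) -> bool:
    """Return True if the role has been granted the requested access right."""
    return action in PERMISSIONS.get(role, set())

print(is_authorized("auditor", "read"))    # True
print(is_authorized("auditor", "delete"))  # False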
Application – A computer program or set of programs that processes records for a specific function.
Authentication – The corroboration that a person is the one claimed. Authentication is the act of verifying the identity of a user and the user’s eligibility to access computerized information. It also can refer to the verification of the correctness of a piece of data.
Availability – Data or information that is accessible and usable upon demand by an authorized person.
Backup – Exact copies of files and data, together with the equipment and procedures needed to restore them, available for use if applications fail, data is lost, the originals are destroyed, or systems are not functioning.
Business Continuity Plan – Also known as a contingency plan. A document describing how an organization responds to an event to ensure that critical business functions continue without unacceptable delay or change.
Business Continuity Planning – Business continuity is the ability to maintain the constant availability of critical systems, applications, and information across the enterprise.
Confidentiality – The status accorded to data or information indicating that it is sensitive and therefore must be protected against theft or improper use and must not be made available or disclosed to unauthorized persons or processes.
Contingency Plan – A plan devised before an emergency occurs, intended to reduce recovery time after a crisis.
Common Vulnerability Scoring System (CVSS) – A method for assessing the severity of computer system security vulnerabilities. CVSS provides a scoring system by which vulnerabilities can be compared and prioritized.
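As a rough illustration of how CVSS scores support prioritization, the Python sketch below sorts an invented list of findings by base score and maps each score to the CVSS v3.x qualitative severity bands; the vulnerability IDs and scores are hypothetical examples, not real findings:

# Illustrative sketch: prioritizing findings by CVSS base score.
# Severity bands follow the CVSS v3.x qualitative rating scale.
def severity(score: float) -> str:
    if score >= 9.0:
        return "Critical"
    if score >= 7.0:
        return "High"
    if score >= 4.0:
        return "Medium"
    if score > 0.0:
        return "Low"
    return "None"

findings = [
    {"id": "VULN-001", "score": 9.8},
    {"id": "VULN-002", "score": 5.3},
    {"id": "VULN-003", "score": 7.5},
]

# Highest-scoring (most severe) vulnerabilities are remediated first.
for finding in sorted(findings, key=lambda v: v["score"], reverse=True):
    print(finding["id"], finding["score"], severity(finding["score"]))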
Confidential Data – University data that cannot be released or is protected by federal or state law. Data may also be confidential if it is protected under a contractual agreement requiring confidentiality.
Data Administrator – An individual or group of individuals responsible for maintaining data resources.
Data Owners – Individuals employed by the university who have been given the responsibility for the integrity, accurate reporting, and use of computerized data.
Disaster Recovery Plan – Disaster recovery refers to the immediate and temporary restoration of critical computing and network operations after a natural or man-made disaster within defined time frames. The Disaster Recovery Plan documents how the university will respond to a disaster and resume the critical business functions within a predetermined period of time; minimize the amount of loss; and repair, or replace, the primary facility to resume data processing support.
Electronic Media – Electronic storage media includes memory devices in computers (hard drives) and any removable/transportable digital memory medium, such as magnetic tape or disk, optical disk, or digital memory card. See also Information Technology Resources.
Encryption – A technique (algorithmic process) used to transform plain, intelligible text by coding the data so that it is unintelligible to an unauthorized reader.
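As an illustrative sketch only (not a prescribed university tool), the following Python example uses the third-party cryptography package's Fernet interface to show plaintext being transformed into unintelligible ciphertext and recovered only with the key:

# Illustrative sketch: symmetric encryption with the third-party
# "cryptography" package (pip install cryptography). In practice the key
# would be managed and stored securely, not generated inline.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # secret key; protect and store securely
cipher = Fernet(key)

token = cipher.encrypt(b"student record: GPA 3.9")  # unintelligible ciphertext
plaintext = cipher.decrypt(token)                   # recoverable only with the key

print(token)       # e.g. b'gAAAAAB...'
print(plaintext)   # b'student record: GPA 3.9'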
Firewall – A dedicated device equipped with safeguards that acts as a single, more easily defended point of connection to the Internet.
Guidelines – A list of recommendations or suggestions to aid in organizing or delivering services. Guidelines differ from regulations or protocols because they are not mandatory.
Information Security – Administrative, physical and technical controls that seek to maintain confidentiality, integrity, and availability of information.
Information Technology (IT) Resources – IT resources are tools that provide access to electronic technological devices, or are themselves electronic technological devices that serve, access, or store information electronically. These resources include all computers and servers; desktop workstations, laptop computers, and handheld computing and tracking devices; cellular and office phones; network devices such as data, voice, and wireless networks, routers, switches, and hubs; peripheral devices such as printers, scanners, and cameras; pagers, radios, voice messaging, computer-generated facsimile transmissions, and copy machines; electronic communications, including email and archived messages; electronic and removable media, including CD-ROMs, tape, and floppy and hard disks; external network access such as the Internet; software, including packaged and internally developed systems and applications; and all information and data stored on university equipment, as well as any other equipment or communications that are considered IT resources by the university.
Information Security Office – The unit responsible for overall information security functions for the university. Information security functions include policy administration, security audits and assessments, security tools, security operations, security investigations, security awareness training, and risk management pertaining to the potential loss or unauthorized disclosure of IT resources and electronic information.
Integrity – In the context of computer and system security, a principle that keeps information from being modified or otherwise corrupted, either maliciously or accidentally. Data integrity refers to the accuracy and completeness of the data.
Logging – The process of electronically recording activities of IT resources.
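For illustration, a minimal Python sketch of recording IT resource activity with the standard logging module is shown below; the file name, logger name, and event fields are hypothetical:

# Illustrative sketch: recording IT-resource activity with Python's
# standard logging module. File name and message format are arbitrary.
import logging

logging.basicConfig(
    filename="resource_activity.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

logger = logging.getLogger("it.resource")
logger.info("user=jdoe action=login workstation=LIB-042 result=success")
logger.warning("user=jdoe action=file_access path=/restricted result=denied")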
Malware – Software intended to access a computer, without consent from the user, to cause harm, interruption, or data loss. Types of malware include trojans, viruses, and worms.
Mitigation – Any action taken to eliminate or reduce the risk from hazards to human life, property, and function.
Need to Know Principle – A security principle stating that a user should have access only to the data he or she needs to perform a particular function.
Password – A protected, generally computer-encrypted string of characters that authenticates an IT resource user to the IT resource.
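As a hedged illustration of one common way such a string is protected in storage, the following Python sketch uses the standard library's PBKDF2 routine to keep a salted, iterated hash of a password and verify later attempts against it; the iteration count and salt length are illustrative choices, not a university standard:

# Illustrative sketch: storing and verifying a password as a salted,
# iterated hash using only the standard library. Parameters are examples.
import hashlib, hmac, os

def hash_password(password: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """Derive a salted hash of the password; returns (salt, digest)."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Constant-time comparison of a new attempt against the stored digest."""
    return hmac.compare_digest(hash_password(password, salt)[1], stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False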
Physical Security – The protection of physical computer systems, related buildings, and equipment from fire and other natural and environmental hazards, as well as from intrusion. Also covers the use of locks, keys, and administrative measures to control access to computer systems and facilities. More broadly, the measures used to provide physical protection of resources against deliberate and accidental threats.
Preventive Controls – Controls designed to prevent or restrict an error, omission, or unauthorized access to IT resources.
Procedures – A predetermined series of steps that must be executed sequentially in order to achieve a consistent result.
Protected Data – University data that is not identified as Confidential or Public data and that must be appropriately protected to ensure a lawful or controlled release (e.g., Connecticut Freedom of Information Act requests).
Public Data – Data that is open to all users, with no security measures necessary.
Risk – The aggregate effect of the likelihood that a particular threat will occur, the degree of vulnerability to that threat, and the potential consequences to the organization if the threat does occur.
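One common (though by no means the only) way to combine these factors is a simple qualitative scoring scheme; the Python sketch below, with invented 1–5 scales and a multiplicative combination, is an assumption-laden illustration rather than the university's methodology:

# Illustrative sketch: a simple qualitative risk score.
# The 1-5 scales and the multiplicative combination are assumptions
# made for illustration, not a prescribed methodology.
def risk_score(likelihood: int, vulnerability: int, impact: int) -> int:
    """Combine threat likelihood, degree of vulnerability, and potential
    impact (each rated 1-5) into a single comparable score."""
    return likelihood * vulnerability * impact

print(risk_score(likelihood=4, vulnerability=3, impact=5))  # 60: high priority
print(risk_score(likelihood=1, vulnerability=2, impact=2))  # 4: low priority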
Risk Analysis – An assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of IT resources.
Risk Management – The process of identifying, measuring, controlling and minimizing or eliminating security risks that may negatively affect University data or information systems.
Safeguards – (also called security controls) The protective measures and controls prescribed to meet the security requirements specified for systems. Safeguards may include, but are not limited to: hardware and software security features; operating procedures; accountability procedures; access and distribution controls; management constraints; personnel security; and physical structures, areas, and devices.
Security Incident – The attempted or successful unauthorized access, use, disclosure, modification, or destruction of information, or interference with system operations, in an information system.
Security or Security Measures – Encompass all of the administrative, physical, and technical safeguards in an information system.
Security Policies – The framework through which an organization establishes needed levels of information security to achieve its desired confidentiality goals. A policy is a statement of information values, protection responsibilities, and organizational commitment to protecting University IT resources.
Standard – A basis for comparison, or a point of reference against which other things may be evaluated.
System Administrator – A person assigned responsibility for managing, maintaining, and ensuring the availability and integrity of an information system.
Threat – An action or event that poses a possible danger to a computer system; the potential for exploitation of a vulnerability.
Unique User Identifier – A unique set of characters assigned to an individual for the purpose of identifying that user.
Vulnerability – An exploitable weakness in a system that can be used to violate the system’s intended behavior.