
Information Technology Handbook

Section 5 Introduction

Definitions

The following definitions of shall, will, must, may, may not, and should are used throughout this Handbook.

  1. Shall, Will, and Must indicate a legal, regulatory, standard, or policy requirement. Shall and Will are used for persons and organizations, and Must for inanimate objects.
  2. May indicates an option.
  3. May Not indicates a prohibition.
  4. Should indicates a recommendation that, in the absence of an alternative providing equal or better protection from risk, is an acceptable approach to achieving a requirement. Should statements are generally more outcome-based; i.e., an alternate method of achieving the requirement may be developed, provided it is documented as effectively managing risk.

The following definitions of Authentication, Availability, Confidentiality, Computer Security Incident, DNS, DNS Spoofing, Domain, Endpoints, Endpoint Security, Endpoint Security Management, Endpoint Security Management System, Event of Interest, Guideline, Incident Management, Incident Response Management, Integrity, Metric, Monitoring, Performance Goal, Performance Measures, Policy, Split DNS, and Standard are used throughout this section.

  1. Authentication is the process of verifying the digital identity of a system user or process.
  2. Availability: Ensuring timely and reliable access to and use of information.
  3. Confidentiality: Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information.
  4. Computer Security Incident is a violation (breach) or imminent threat of violation of computer security policies, acceptable use policies, or standard computer security practices, which may include, but is not limited to:
    • Widespread infections from viruses, worms, Trojan horses, or other malicious code;
    • Unauthorized use of computer accounts and computer systems;
    • Unauthorized, intentional or inadvertent disclosure or modification of sensitive/critical data or infrastructure;
    • Intentional disruption of critical system functionality;
    • Intentional or inadvertent penetration of a firewall;
    • Compromise of any server, including Web server defacement or database server compromise;
    • Exploitation of other weaknesses, known or unknown;
    • Child pornography;
    • Attempts to obtain information in order to commit fraud, prevent critical operations, or endanger state, system, or national security;
    • Violations of state or USG security policies or standards that threaten or compromise the security objectives of state or USG data, technology, or communications systems; and,
    • Any violation of the “Appropriate Use Policy.”
  5. DNS refers to the domain name system, the Internet technology that converts domain names to their corresponding IP addresses.
  6. DNS Spoofing refers to tricking a DNS server into giving out bad information. An attacker sends a recursive query to the victim’s server, using the victim’s server to resolve the query. The answer to the query lies in a zone the attacker controls, and the answer given by the attacker’s name server includes a false authoritative record for a domain name controlled by a third party. The victim’s server caches the bogus record, and once spoofed, the victim’s resolver will continue to use the false record in its cache, potentially misdirecting email or any other Internet service. This is a major potential exposure for credit card information, trade secrets, and other highly sensitive information.
    • Note: Most modern servers will not cache a fake record because it does not fall in the same parent zone as the record that was requested.
  7. Domain most often refers to a domain zone, but it is also used to describe a domain name.
  8. Endpoints can include, but are not limited to, PCs, laptops, smart phones, tablets and specialized equipment such as bar code readers or point of sale (POS) terminals.
  9. Endpoint Security is an approach to network protection that requires each computing device on a corporate network to comply with certain standards before network access is granted. Simple forms of endpoint security include personal firewalls or anti-virus software that is distributed and then monitored and updated from a server.
  10. Endpoint Security Management is a policy-based approach to network security that requires endpoint devices to comply with specific criteria before they are granted access to network resources.
  11. Endpoint Security Management Systems, which can be purchased as software or as a dedicated appliance, discover, manage, and control computing devices that request access to the corporate network. Endpoints that do not comply with policy can be controlled by the system to varying degrees. For example, the system may remove local administrative rights or restrict Internet browsing capabilities (illustrated in the second sketch following this list).
  12. Event of Interest is a questionable or suspicious activity that could threaten the security objectives for critical or sensitive data or infrastructure. It may or may not have criminal implications.
  13. Guideline: A guideline is a document that provides guidance on how to achieve compliance with a policy.
  14. Incident Management is the process of detecting, mitigating, and analyzing threats or violations of security policies and controls and limiting their effect.
  15. Incident Response Management is the process of detecting, mitigating, and analyzing threats or violations of security policies and limiting their effect.
  16. Integrity: Guarding against improper information modification or destruction, including ensuring information non-repudiation and authenticity.
  17. Metric is a numeric indicator used to gauge system-wide program performance and monitor progress toward accomplishing system-wide goals and objectives. It monitors and measures accomplishment of goals by quantifying the level of implementation and effectiveness.
  18. Monitoring is observing and checking against a set standard or configuration.
  19. Performance Goal is the desired result of implementing the security objective or technique, as measured by the metric.
  20. Performance Measures are the actions required to accomplish the performance goal, validated through the completion and analysis of the institution report.
  21. Policy: A policy is typically a concise document that outlines specific requirements, business rules, or company stance that must be met. The policy is the organization’s stance on an issue, program, or system. It is a rule that everyone must meet.
  22. Split DNS is a configuration in which internal hosts are directed to an internal domain name server for name resolution, while external hosts are directed to an external domain name server (illustrated in the first sketch following this list).
  23. Standard: A standard is a requirement that supports a policy.
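
The two sketches below are illustrative only and are not part of the USG standards; both are written in Python, and all hostnames, addresses, field names, and thresholds in them are hypothetical.

The first sketch relates to the DNS, DNS Spoofing, and Split DNS definitions (items 5, 6, and 22). Assuming the third-party dnspython package, it queries the same hostname against an internal and an external resolver: differing answers are the expected behavior in a split DNS deployment, while an unexpected answer for an internal-only name can be a symptom of the misdirection described under DNS Spoofing.

    import dns.exception
    import dns.resolver  # third-party package: dnspython (an assumption, not required by this Handbook)

    INTERNAL_RESOLVER = "10.0.0.53"   # placeholder internal DNS server address
    EXTERNAL_RESOLVER = "8.8.8.8"     # a public resolver, used for comparison
    HOSTNAME = "www.example.edu"      # placeholder hostname

    def lookup(name, nameserver):
        """Return the set of A-record addresses the given resolver hands back."""
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [nameserver]
        resolver.lifetime = 5.0  # give up after five seconds
        try:
            answer = resolver.resolve(name, "A")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.exception.Timeout):
            return set()
        return {rdata.address for rdata in answer}

    if __name__ == "__main__":
        internal_view = lookup(HOSTNAME, INTERNAL_RESOLVER)
        external_view = lookup(HOSTNAME, EXTERNAL_RESOLVER)
        print("internal view:", internal_view or "no record")
        print("external view:", external_view or "no record")
        # In a split DNS deployment the two views are expected to differ; an
        # unexpected external answer for an internal-only name is worth investigating.

The second sketch relates to the Endpoint Security Management definitions (items 10 and 11). It shows one way a policy-based compliance check could be expressed: an endpoint reports a few security attributes, they are compared against policy criteria, and a non-compliant endpoint is restricted or quarantined rather than granted full access. The decision rules shown are an example only, not a USG requirement.

    from dataclasses import dataclass

    @dataclass
    class Endpoint:
        hostname: str
        antivirus_running: bool
        firewall_enabled: bool
        os_patch_age_days: int  # days since operating system patches were last applied

    # Hypothetical criteria an endpoint must meet before network access is granted.
    POLICY = {
        "require_antivirus": True,
        "require_firewall": True,
        "max_patch_age_days": 30,
    }

    def evaluate(endpoint):
        """Return an access decision: 'allow', 'restrict', or 'quarantine'."""
        failures = []
        if POLICY["require_antivirus"] and not endpoint.antivirus_running:
            failures.append("antivirus not running")
        if POLICY["require_firewall"] and not endpoint.firewall_enabled:
            failures.append("firewall disabled")
        if endpoint.os_patch_age_days > POLICY["max_patch_age_days"]:
            failures.append("operating system patches out of date")

        if not failures:
            return "allow"
        # Non-compliant endpoints are controlled to varying degrees: serious gaps
        # lead to quarantine, lesser ones to restricted access while remediation occurs.
        return "quarantine" if "antivirus not running" in failures else "restrict"

    if __name__ == "__main__":
        laptop = Endpoint("staff-laptop-42", antivirus_running=True,
                          firewall_enabled=False, os_patch_age_days=45)
        print(laptop.hostname, "->", evaluate(laptop))  # prints: staff-laptop-42 -> restrict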

Implementation and Compliance

Section Number | Section Name | Compilation Date | Published Date | Compliance Date | Revision Date(s)
5.1 | USG Information Security Program | February 2009 | February 2009 to InfoSec; February 2013 to IT Handbook | February 2009 | May 2014
5.2 | Information Security Organization and Administration | February 2009 | February 2009 to InfoSec; February 2013 to IT Handbook | February 2009 | May 2014
5.3 | Incident Management | December 2008 | December 2008 to InfoSec; February 2013 to IT Handbook | February 2009 | May 2014
5.4 | USG Information Asset Management and Protection Standard | July 2013 | May 2014 | TBD |
5.5 | IT/IS Risk Management | April 2010 | April 2010 to InfoSec; February 2013 to IT Handbook | April 2010 | May 2014
5.6 | USG Information System Categorization and Data Classification Standard | June 2013 | May 2014 | July 2014 |
5.7 | USG Classification of Information Standard | June 2013 | May 2014 | July 2015 |
5.8 | USG Endpoint Security Standard | June 2013 | May 2014 | July 2015 |
5.9 | Security Awareness, Training, and Education | April 2009 | April 2009 to InfoSec; February 2013 to IT Handbook | April 2009 | May 2014
5.10 | Required Reporting | April 2009 | April 2009 to InfoSec; February 2013 to IT Handbook | April 2009 | May 2014
5.11 | Minimum Security Standards for USG Networked Devices | October 2008 | October 2008 to InfoSec; May 2014 to IT Handbook | October 2008 | May 2014
5.12 | Password Security | July 2010 | July 2010 to InfoSec; February 2013 to IT Handbook | July 2010 | May 2014
5.13 | Domain Name Service | February 2011 | February 2011 to InfoSec; February 2013 to IT Handbook | February 2011 | May 2014
5.14 | Copyright Violation Guideline | April 2010 | April 2010 to InfoSec; May 2014 to IT Handbook | April 2010 | May 2014
5.15 | Identity Theft Prevention Standard – Red Flags Rule | January 2011 | January 2011 to InfoSec; May 2014 to IT Handbook | January 2011 | May 2014
5.16 | Email Use and Protection Standard | January 2009 | January 2009 to InfoSec; May 2014 to IT Handbook | May 2014 |

Version date: May 16, 2014

Introduction

Information and information systems are strategic assets to all University System of Georgia (USG) entities. The Board of Regents (BoR) recognizes that information created, collected, or distributed using technology by a USG institution, the University System Office (USO) [which includes the Shared Services Center (SSC)], the Georgia Public Library System (GPLS), and the Georgia Archives is a valuable asset and must be protected from unauthorized disclosure, modification, or destruction. The degree of protection needed is based on the nature of the resource and its intended use. Each USG institution, the USO, the GPLS, and the Georgia Archives has the responsibility to employ prudent information security standards and best practices to minimize the risk and threats to the integrity, confidentiality, and availability of USG information and information systems.

Information security means the protection of information and information systems, equipment, and people from a wide spectrum of risks and threats. Implementing appropriate security measures and controls to provide for the confidentiality, integrity, and availability of information, regardless of its form (electronic, print, or other media), is critical to ensuring business continuity and protecting against unauthorized access, use, disclosure, disruption, modification, or destruction.

It is USG policy to provide an environment that encourages the free exchange of ideas and sharing of information. Access to this environment and the USG’s information technology (IT) resources is a privilege and must be treated with the highest ethical standards.

Applicability

All USG institutions, the USO, the GPLS, and the Georgia Archives must comply with the information security and privacy policies, standards, and procedures issued by USG Information Security & ePrivacy, and report and file the appropriate compliance documents as identified in this policy. All USG institutions, the USO, the GPLS, and the Georgia Archives must adhere to the Information Security Reporting Requirements, as noted in Section 5.10 of this Handbook.

This section is intended to have broad application, particularly with respect to information and information systems that impact the operations of the USG institutions, the USO, the GPLS, and the Georgia Archives. In a similar manner, all contractual agreements with third-party vendors must adhere to the guidance provided. An appropriate Service Level Agreement (SLA) and Non-Disclosure Agreement (NDA) should be constructed to ensure roles and requirements are acknowledged and followed.
