The Role of Big Data in Case Management and Security

Guest post written by Astrid Sutton

In this digital era, technology is constantly evolving. Workplaces have gone from pen-and-paper case management processes to advanced computer-based systems. Today, the market for case management software continues to grow. Findings by Reportlinker show that it’s expected to grow to $7.6 billion by the end of 2025. And recent technological innovations have only enhanced it further. In particular, big data systems are expanding the ways case management software can expedite tasks and processes.


How Does Big Data Assist in Case Management?

As the name implies, big data refers to huge amounts of data — not just gigabytes and terabytes, but petabytes (1,024 terabytes) or even exabytes (1,024 petabytes). With advanced data collection and management systems, organizations can gather vast quantities of data and synthesize them into valuable insights. These insights can then be used to make better decisions for every stage of the case management process.

For instance, case management in the healthcare sector involves the collaborative process of assessing patients, planning and implementing treatments, and facilitating follow-up care. As healthcare takes a more patient-centered approach, it requires more data to inform decisions. Big data systems can therefore assist in the case management of individual patients by storing and analyzing their unique medical background, their needs, and their preferences. Summarizing the gathered data into reports gives medical professionals valuable knowledge about how to provide the best care for each patient.

However, a growing reliance on valuable data brings the need to secure it, so the rapid proliferation of cybersecurity systems comes as no surprise. With roughly 4.1 billion data records compromised in the first six months of 2019, organizations across the country scrambled to put up more robust data protection measures. Luckily, big data has a part to play in cybersecurity too.


How Does Big Data Assist in Data Security?

Security big data analytics is a tool organizations can use to gather and analyze data about their internal systems. It uses artificial intelligence, particularly machine learning and deep learning, to process the data and pinpoint patterns that could indicate a cyber threat. Because it picks up on patterns, it can guard against a myriad of threats, such as data theft, ransomware, and attacks on servers. Security big data analytics can also ingest and correlate far more data than traditional cybersecurity software, making it a robust data security option.
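To make the pattern-detection idea above more concrete, here is a minimal sketch, assuming Python with scikit-learn, that flags unusual account activity with an Isolation Forest. The features, values, and record layout are hypothetical illustrations rather than a description of any specific product.

# Minimal anomaly-detection sketch with hypothetical features; not a production system.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [logins_per_hour, megabytes_uploaded, failed_login_ratio]
baseline = np.array([
    [4, 12.0, 0.02],
    [5, 10.5, 0.00],
    [3, 15.2, 0.05],
    [6, 11.1, 0.01],
    [4, 13.4, 0.03],
])

# Train on "normal" historical behavior.
model = IsolationForest(contamination=0.1, random_state=42).fit(baseline)

# Score new activity; a prediction of -1 means the record looks anomalous.
new_activity = np.array([
    [5, 12.3, 0.02],    # looks normal
    [80, 900.0, 0.60],  # bulk upload plus many failed logins: likely flagged
])
for row, label in zip(new_activity, model.predict(new_activity)):
    print(row, "ANOMALY" if label == -1 else "ok")

In practice, a security analytics platform would train on far larger volumes of telemetry and route flagged records to an analyst's review queue.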

The need for heightened security is also changing the way data analytics is taught. Those who earn a degree in data analytics are now trained not only in areas such as forecasting and predictive modeling, but also in key data security principles. Today, data security is just as important as data collection, which is why security is a core part of any education related to big data.

Data is the lifeblood of business. According to the US Bureau of Labor Statistics, the analytics technology and solutions market has a compound annual growth rate of 13.2% through 2022. And with more industries investing in big data solutions, the job outlook for data professionals is strong. Among these industries are banking, manufacturing, and government, all of which rely on meticulous case management processes.

Case management on its own already provides multiple benefits, from improved manageability to better insights. But paired with big data, this process is set to expand its reach, and produce even more data-driven insights.


STARS III Press Release

GSA Awards Armedia a Spot on $50B STARS III Contract

VIENNA, Va, June 29, 2021—Armedia, through its Hartwood–Armedia Joint Venture with Hartwood Consulting Group, was selected by the General Services Administration (GSA) for a spot on its $50 billion STARS III contract. The governmentwide acquisition contract (GWAC) is a best-in-class, easy-to-use contract that allows agencies to procure IT from pre-vetted prime contractors. The STARS III contract is expected to focus on emerging technologies and technologies outside of the continental United States, in addition to a full range of Information Technology services.

“We are excited to partner with our protégé, Hartwood Consulting Group, on the STARS III GWAC. This gives us another Best in Class vehicle on which we can service our current and future clients, while continuing to strengthen the relationship with our Protege. The combination of Armedia and Hartwood offers a unique mix of technical excellence and responsive PMO support geared toward both Federal Civilian and DoD agencies,” stated Jan Coley, Vice President of Business Development at Armedia.

Hartwood-Armedia Joint Venture (HAJV)

Hartwood-Armedia Joint Venture (HAJV), an SBA-approved 8(a) and SDVOSB joint venture, combines two companies that are complementary in capabilities, culture, and vision. Our federal clients require a partner who understands the technological landscape and has proven past performance supporting application sustainment and platform operations and maintenance within that environment. Our mission of helping clients transform their operations through data management, process improvement, and seamless integration built on innovation has proven successful with the USDA, Department of Defense, US Senate, Internal Revenue Service, and US Marine Corps. Learn more at www.h-ajvll.com.


About Hartwood

Hartwood Consulting Group, a VA-certified Service-Disabled Veteran-Owned Small Business (SDVOSB) and an 8(a) firm, helps government organizations and their experts better understand their services and systems. Hartwood offers an accomplished range of professional and technical capabilities in collaborative support of the government, including all aspects of system augmentation, configuration, assessment, and support for the RD applications and databases. Hartwood personnel have collectively delivered 100+ rapid application development solutions for USMC HQMC, incorporating tailored Agile, spiral, and waterfall methodologies. Learn more at http://www.hartwoodcg.com/.


About Armedia

Armedia, LLC is a Veteran-Owned Small Business and a long-term solution provider to the US Federal Government. Founded in 2002, Armedia is a technology advisory firm focused on helping organizations build the systems and cultures necessary to manage and harness their data. We provide expertise in process improvement and in supporting that improvement through quality solutions and effective change management. We also provide expertise and services in content intelligence, big data analytics, Agile software development, and open-source and mobile technologies. Appraised at CMMI-DEV Level 3 and ISO 9001:2015 certified, Armedia brings rigorous process discipline to all projects. In addition, Armedia holds a TS facility clearance.


Future-proofing organizations with open source software solutions


Procurement practices built around vendor lock-in can undermine both the flexibility to adapt to changing mission objectives and the cost-efficiency of IT modernization initiatives, slowing government agencies’ ability to face new challenges. Fueled by misconceptions, open-source solutions used to be overlooked as viable alternatives to proprietary software within large enterprises. With the successes of Linux, Java, Apache, ArkCase, Drupal, WordPress, Alfresco, Nuxeo, and many others, open-source technologies have proven to be secure, cost-effective, and scalable alternatives for enterprises looking to modernize their systems.

The agency conundrum: Do more and spend less

The growing workload of many agencies rests on outdated, proprietary IT systems that are not flexible enough to respond to new opportunities and challenges. Whether it’s functionality limitations, compute limitations, or the inability to harden against cyber threats, the solutions of yesteryear are the challenges of today that prevent agencies from future-proofing.
To add to the difficulty, cost is always a key ingredient in any agency procurement process. Since public funds mean public responsibility, agencies face the challenge of maximizing their IT investments.


The budgetary weight of vendor lock-in fueled by open source myths

There may be multiple valid reasons why an organization would select proprietary software over open-source software, but one of the most persistent myths surrounding open source concerns security. The argument that open-source software has more vulnerabilities may have had some merit for solutions that were in their grassroots stages or are no longer supported by a community. For actively supported software, however, security vulnerabilities are quickly identified and fixed, precisely because the source code is public and anyone can contribute. In this regard, a community of developers can grow and strengthen open-source software much faster than a company with limited resources.

Largely because of their wide user bases and typically large support communities, open-source solutions tend to grow faster. This growth can be in:

  • Versatility, by attracting more solution providers who build their add-ons for specific or core functionality improvements.
  • Code quality, thanks to different contributors collaborating on code sections, leveraging years of experience by developers from different industries and walks of life.
  • Security, because of vigilant peer review surrounding new updates, new features, code refactoring, etc.

This is not to say that open-source solutions are by default the better choice in every use case. Organizations still need to do their due diligence: know who the contributors and developers of the software are, what has been done with the code, how the software has evolved, and how responsive the community is.

Open source is the new normal

In 2016, a White House initiative to promote open source resulted in a growing repository of code developed by, or for, government agencies. This formidable index of completely open-source projects is an excellent resource for developing cost-effective solutions.

The private sector has not been idle either, and large companies are becoming leaders in open-source projects. Open-source solutions offer agencies reliable platforms that allow for interconnectivity and compatibility with new AI/ML solutions while making those business solutions more streamlined, says Christine Cox, regional vice president of federal sales for SUSE.

Acceptance of open source has grown into a global phenomenon, and most governments around the world have opened up to open-source software initiatives.


Open source brings connectivity, flexibility and cost-efficiency

Using open-source technologies gives agencies increased interconnectivity, safely and reliably moving secure data across multiple cloud environments. This helps agencies automate repetitive data processing and management tasks, streamline business processes, and cut costs at the same time.

By using open source front and center, agencies benefit on several levels:

  • Lower IT expenses: when open-source software is used as a base, any qualified company can provide upgrade and maintenance services. This opens the gates for competitive bidding, helping agencies reduce the cost of ownership and maintenance compared to proprietary solutions.
  • Lower risk of ownership: unlike proprietary software, where an agency depends entirely on the current vendor, open source relieves this risk. If an agency is not pleased with the support from the original vendor, it can opt for a different vendor to support the same open-source solution.
  • No user count constraints: while most proprietary solution providers impose contractual constraints on the number of users, open-source solution providers do not normally practice this. Agencies can open up access to the solution as needed without facing additional licensing costs.
  • Mutualization: with open-source solutions, agencies can leverage the large repository of components and solutions already developed and used in other agencies. This standardization of features helps all agencies get better solutions without duplicating development work.
  • Agile security: proprietary software code is owned and accessed only by the company that built it. If hackers find a vulnerability, agencies are left to the vendor’s capacity to detect, identify, and remedy the situation. With open-source software, this limitation is not an issue: if a vendor cannot remedy a known vulnerability, agencies are free to bring in other subject matter experts and resolve the problem faster.

These are just several of the benefits that agencies gain by using open source software. Tapping into the large repository of already built software used by other agencies, combined with the flexibility of companies that provide support and maintenance, puts agencies in a liberating position, both contractually and budget-wise.

How will your organization benefit from open source solutions?

Ultimately, all this translates into better service for taxpayers, delivered through superior software solutions maintained by agile companies that won their bids on quality, reliability, and cost.

If your organization is considering an IT modernization initiative, we’d love to hear from you. Our staff is ready to hop on a no-obligation call to discuss your business needs and the solutions that can capitalize on these opportunities.

ZyLab + ArkCase + eDiscovery: A Privacy Management Solution to Solve CCPA/GDPR Challenges


With the emergence of data privacy laws in the United States and the European Union, companies are facing an uphill battle. The legal framework around Data Subject Access Requests (DSARs) imposes strict rules on processing and responding to requests, and failing to respond to a DSAR can mean serious financial penalties.

Under the General Data Protection Regulation (GDPR) in the EU, non-compliance fines can be as high as 20 million euros or 4% of annual global revenue, whichever is higher. Under its closest US equivalent, the California Consumer Privacy Act (CCPA), regulators can levy civil penalties per violation, and the law leaves the door open for consumer lawsuits over data breaches.

Despite such large fines, according to Egress, only 30% of surveyed businesses are in regulatory compliance, and only another 27% plan to become compliant in 2020. A recent ZyLAB survey found that for 45.4% of organizations implementing a DSAR solution, the biggest challenge is remaining compliant in the future.

In this post, we cover DSAR compliance and propose several off-the-shelf technologies that together provide a reliable, scalable, and, most importantly, cost-effective DSAR solution.

The biggest DSAR compliance challenge

We live in a time of corporate data explosion. Digitally connected, we are creating a digital trail wherever we go, whatever we do. Our civilization’s digital footprint doubles every 18 months.

With the introduction of DSARs, data has gone from an undisputed source of wealth and business value to a regulatory liability that demands fundamental changes in organizational behavior.

GDPR and CCPA create a whole set of new obligations that organizations can’t ignore.

  1. Citizens can use the “Right of Access” (GDPR) or “Right to Know” (CCPA) to ask whether an organization possesses personal data about them.
  2. Citizens can use the “Right to be Forgotten” (GDPR) or “Right to Delete” (CCPA) to demand deletion of all their personal data that the organization possesses. Data controllers are obligated to comply “without undue delay,” which generally means within a one-month time frame.
  3. Organizations should follow specific data management regulations such as:
    Strict cybersecurity requirements (mandatory data encryption, data security measures, report of breaches, etc.).
    Data processing rules.
    Redact or pseudonymize all sensitive information when there is no regulatory need to collect, possess, manage, or use it.
  4. They must ask the user for prior consent before the user’s personal information is collected and stored.
  5. All data breaches must be reported, and all subjects whose data has been breached must be informed. Organizations have 72 hours from discovery to notify authorities and must keep records of the breach. Data subjects must also be notified “without undue delay” when breaches affect their unencrypted personal data. The CCPA allows individual or class action lawsuits against organizations that fail to adopt reasonable security practices to prevent data breaches.
  6. All third-party integrations must be in regulatory compliance, and the organization should be able to demonstrate that.

As we see, regulatory compliance is not an option. It is an obligation.

From the perspective of DSAR, organizations need to have a scalable system to get these requests, process them quickly and respond on time. However, with a growing data footprint stored in disconnected data sources like emails, chat systems and physical correspondence, searching for each requestor’s data is a daunting challenge. Luckily, there are advanced search solutions that can handle this kind of workload.

ZyLAB’s eDiscovery: The silver bullet on the data train?

 


At its core, eDiscovery is the collection, processing, and indexing of disparate content so that it can be thoroughly reviewed and redacted. Heavily used in the legal sector, eDiscovery enables organizations to scour large sources of data for Personally Identifiable Information (PII) at lightning speed.

When an organization receives a DSAR, the challenge lies in tracking all data sources for specific details, usually personally identifiable information and other data about the requestor and the data holder. These requests can be simple to process, but as organizations grow, so does the complexity of these requests.

In two cases, organizations have reported staggering costs for processing complex SARs. The first comes from the Nursing and Midwifery Council in the UK, where a single, heavily redacted DSAR cost about $315,000 in processing and legal fees. In another case, Oxford University spent roughly $150,000 responding to a single SAR because it had to process over half a million emails for the requestor, Dr. Cécile Deer.

Without a software solution that can process digital data and find the requestor’s personally identifiable information (all while masking other individuals’ PII), responding to SARs can be extremely expensive.

This is why an eDiscovery solution like ZyLAB ONE comes close to being a silver bullet for DSAR challenges. The ability to find any document across many locations, quickly and at scale, is essential for a reliable DSAR solution.

Without using eDiscovery, any DSAR solution would struggle with the search functionality, which is essential for timely DSAR processing.

eDiscovery is responsible for:

  • Locating and processing all relevant data across all repositories, e-mails, etc.
  • Redacting all personally identifiable information related to other individuals mentioned in the same content.
  • Collecting information directly from the relevant organization’s sources with true data integrity.
  • De-duplicating the information. With any DSAR search, duplicate documents can make up as much as 80% of the result set. Deduplication eliminates a huge portion of the work, speeding up the DSAR response time (a minimal sketch of this step appears below).
  • Automatically unpacking containers of files and making every component searchable.
  • Enriching non-searchable data such as scans, images, media files or unsearchable PDFs, so that all information can be searched and used.
  • Analyzing, classifying and organizing information for a quick and comprehensive review.
  • Using auto-redaction to anonymize or pseudonymize personal and confidential information. This is crucial for data transfers outside of the EU. Anonymization is the more robust option: after redaction, the data subject is no longer identifiable at all. With pseudonymization, the data can no longer be attributed to a data subject without additional information, and that additional identifying information is kept separately, subject to technical and organizational security measures. If the identifiers are ever reunited with the core data, the result must be safeguarded like any other personal data.
  • Automatically converting all electronic file formats to one standard format before redactions.
  • Detailed tracking and reporting provide a complete audit trail to prove requested personal data erasure.

These are the key features to solve the GDPR/CCPA bottleneck for finding any document across all locations in the organization. It’s important to note that not all eDiscovery solutions have all these features incorporated.
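To illustrate two of the steps listed above, deduplication and pseudonymization, here is a minimal, standard-library-only Python sketch. The content hash, the e-mail pattern, and the token format are assumptions made for illustration; they are not ZyLAB ONE’s actual pipeline.

# Illustrative dedup and pseudonymization sketch; not ZyLAB ONE's implementation.
import hashlib
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def deduplicate(documents):
    """Keep one copy of each document, identified by its SHA-256 content hash."""
    seen, unique = set(), []
    for doc in documents:
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

def pseudonymize(text, mapping):
    """Replace e-mail addresses with stable tokens; the mapping is stored separately."""
    def _token(match):
        email = match.group(0)
        if email not in mapping:
            mapping[email] = f"SUBJECT-{len(mapping) + 1:04d}"
        return mapping[email]
    return EMAIL_RE.sub(_token, text)

docs = [
    "Complaint filed by jane.doe@example.com on 2020-03-01.",
    "Complaint filed by jane.doe@example.com on 2020-03-01.",  # exact duplicate
    "Follow-up from john.smith@example.com about jane.doe@example.com.",
]
mapping = {}  # kept under separate technical and organizational safeguards
for doc in deduplicate(docs):
    print(pseudonymize(doc, mapping))
print(mapping)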

ZyLAB ONE’s AI-powered eDiscovery combines advanced search, text mining, auto-classification, natural language processing (NLP), and machine learning. Using these techniques, ZyLAB ONE can cull information from archives to determine what can be destroyed without harming the business, historical, or legal need for that data.

ZyLAB ONE eDiscovery can scale out and manage search over large clusters of machines. Both indexing and searching can be distributed over as many machines as desired, and indexes can be centralized or distributed for better performance or robustness needs.

This results in almost unlimited scalability of the search engine. Depending on the hardware, ZyLAB can index multiple terabytes of data in a matter of hours. At the same time, it handles large queries faster than any other product, supporting positional operators, Boolean and quorum search, wildcards, fuzzy matching (including at the beginning of words), complex regular expressions, and flexible parsing and tokenization.

Support functions such as index checksums, index status monitoring, an environment status tool, and current running status help with control and maintenance.

Having a powerful eDiscovery component alone, however, isn’t enough for a solid DSAR solution. Organizations need to marry this search capability with a system that can capture DSAR requests and use workflows and automation to process them at scale.

DSAR Management with the ArkCase Case Management Platform

 


One of the hallmarks of DSAR request management, other than being data-intensive, is that it follows a relatively fixed workflow (sketched in code after this list):

  • It all starts with a public portal where people can fill out a DSAR request.
  • Next, there is a mandatory identity verification to confirm that the requester is the data subject. Then, the request is queued for processing.
  • The processing has a fixed workflow of finding the data, deduplication, redaction, review, and delivery.
  • The data subject can follow up with a deletion request, to which the company responds with proof of deletion beyond recovery.
  • Lastly, the entire process from submission to closure should be auditable, meaning that at every stage of the workflow, log entries are recorded and stored securely.
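As a rough illustration of that fixed workflow, the sketch below models the stages as a simple state machine with an audit log. The stage names and the DsarRequest class are hypothetical; ArkCase’s actual workflow engine and configuration are far richer than this.

# Hypothetical DSAR workflow sketch; stage names are illustrative, not ArkCase configuration.
from datetime import datetime, timezone

STAGES = ["submitted", "identity_verified", "data_collected",
          "deduplicated", "redacted", "reviewed", "delivered", "closed"]

class DsarRequest:
    def __init__(self, request_id, subject):
        self.request_id = request_id
        self.subject = subject
        self.stage = STAGES[0]
        self.audit_log = []  # every transition is recorded for auditability
        self._log("request submitted")

    def _log(self, event):
        self.audit_log.append((datetime.now(timezone.utc).isoformat(), self.stage, event))

    def advance(self):
        """Move to the next fixed stage; refuse to advance a closed request."""
        idx = STAGES.index(self.stage)
        if idx == len(STAGES) - 1:
            raise ValueError("request already closed")
        self.stage = STAGES[idx + 1]
        self._log(f"advanced to {self.stage}")

req = DsarRequest("DSAR-2021-0001", "jane.doe@example.com")
while req.stage != "closed":
    req.advance()
for entry in req.audit_log:
    print(entry)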

Software solutions that are preconfigured with workflows and forms are also heavily used in medical and legal practices. These case management solutions enable organizations to automate repetitive tasks, streamline workflows, leverage collaboration and use the cloud for global yet secure access.

One of our favorite case management platforms is ArkCase. It is a modern, open-source case management platform that accelerates case processing time. Thanks to its flexibility, ArkCase offers many off-the-shelf solutions such as data privacy management, FOIA requests management, complaint management, correspondence management, legal case management, etc.

ArkCase is a robust platform that comes with a personalized dashboard, document management, collaboration, a rules engine, configurable and pre-configured workflows, advanced search, reporting, calendaring, task management, and multimedia search, and it is fully auditable. It is an open-source solution that is field-tested, cost-effective, and future-proof. The ArkCase DSAR Solution also builds on several integrations:

  • Content/Records Management System
  • Robotic Process Automation (RPA)
  • Analytics
  • Correspondence Management
  • Modern eDiscovery

With these integrations, the ArkCase DSAR Solution claims a processing time savings of 60%.  Without the DSAR Solution, the cost of manually processing privacy requests is $1,400 per request.

With flexible licensing and pricing, ArkCase DSAR can be an excellent way to achieve full CCPA and GDPR compliance without breaking the budget.

Wrap-Up: The Combined Benefits Of ZyLAB ONE And ArkCase 


As a result of legal frameworks such as GDPR and CCPA, companies are facing an ever-growing number of data disclosure requests governed by DSARs. Companies that gather and store large volumes of user data will find it difficult to respond to these requests on time, even if all their data is stored digitally.

Finding all personally identifiable information related to the requestor, while redacting all other PII from other individuals mentioned in the requested documents, is a daunting task that cannot be solved with increasing the workforce alone. Therefore, organizations turn to scalable technologies like ZyLAB ONE and ArkCase. ZyLAB ONE provides a reliable and fast eDiscovery search, while ArkCase enables people to work optimally, one case at a time.

ZyLAB ONE eDiscovery has the most scalable and flexible architecture on the market. ZyLAB ONE easily handles large data volumes. The total system capacity can be scaled up by assigning as many virtual machines as needed to increase the computing capacity. As a SaaS-based eDiscovery solution, it is suitable for thin-client and remote work use, but it can also be implemented on-premise or hybrid.

ZyLAB ONE eDiscovery provides seamless integration for an efficient process without interruption. A flexible architecture “follows the wave of data” through the eDiscovery system during a project. Thanks to this flexible architecture, ZyLAB ONE provides a future-proof solution for processing large amounts of data, ensuring reliable eDiscovery functionality.

ArkCase is a FedRAMP Moderate, open-source platform with a proven track record. As a cloud DSAR solution, it is suitable for thin-client and remote work, but it can also be deployed on-premises or in a hybrid model. ArkCase implements federated search, an information retrieval technique that searches multiple resources simultaneously.
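For readers curious what federated search looks like mechanically, here is a minimal Python sketch that fans a query out to several sources concurrently and merges the hits. The search_* functions are placeholders, not ArkCase or ZyLAB APIs.

# Minimal federated-search sketch: query several sources in parallel and merge results.
from concurrent.futures import ThreadPoolExecutor

def search_email_archive(query):
    return [f"email: message mentioning '{query}'"]

def search_file_share(query):
    return [f"file: document containing '{query}'"]

def search_case_records(query):
    return [f"case: record referencing '{query}'"]

SOURCES = [search_email_archive, search_file_share, search_case_records]

def federated_search(query):
    """Run the same query against every source in parallel and merge the hits."""
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        result_lists = list(pool.map(lambda source: source(query), SOURCES))
    return [hit for hits in result_lists for hit in hits]

for hit in federated_search("jane.doe@example.com"):
    print(hit)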

Combined, the two provide a scalable DSAR solution: an organized, central location where all data is standardized and where all your Production Readiness Review (PRR) processes begin and end. The platform gives you control over data access, document review, and redaction, with the ability to check request status at any point in time. All actions are documented and traceable, which helps guard against litigation.

If you’re interested in finding out more details about how Armedia can help as a Solutions Integrator and solve your Data Privacy Management needs, contact us for a no-obligation consultation.

 

Implementing Zero Trust Architecture With Armedia

Overview

Traditional network security protocols rely on strategies that keep malicious actors out of the network but allow almost unrestricted access to users and devices inside it. These traditional architectures leverage legacy technologies such as firewalls, virtual private networks (VPNs), and network access control (NAC) to build multiple security layers at the perimeter. In essence, they trust users inside the infrastructure and verify only those outside. If attackers manage to gain access to the internal network, they can reach the entire infrastructure.


Let’s take a look at a typical attack progression to see why this is not enough:

  1. A phishing email attack targets employees.
  2. A privileged machine is compromised.
  3. A keylogger is installed on corporate machines.
  4. A developer password is compromised.
  5. The production environment is compromised via the privileged machine.
  6. Database credentials are compromised.
  7. Data is exfiltrated via a compromised cloud hosting service, and so on.

These are the threats that come from within the network. How do you plan to prevent such scenarios?

The answer is Zero Trust Architecture.

What is a Zero Trust Architecture?


“Zero trust” is a term that was coined by John Kindervag in 2010. He proposed that companies should move away from perimeter-centric network security approaches to a model that involves continuous trust verification mechanisms across every user, device, layer, and application.

(source: https://www.csoonline.com/article/3247848/what-is-zero-trust-a-model-for-more-effective-security.html)

ZTA takes the “Never trust, always verify” approach, implementing strict identity verification for users and devices whenever they access resources, whether from inside the network perimeter or outside it.

Once a user or device is inside the network, Zero Trust Architecture enforces limited access to contain malicious activity in case the entity turns out to be an attacker. Thus, if a security breach happens, it cannot propagate to the whole infrastructure, as it can in traditional network security architectures.

ZTA assumes the network is already in a compromised state, so every user and device must go through strict identity verification to prove they are not malicious actors. This model treats every actor as external and continuously challenges them to verify trust. Once verified, only the required access is granted.

How to implement a Zero Trust Architecture


Zero Trust Architecture is not a technology, nor is it tied to any specific technology. It is a holistic strategy and approach to implementing network security based on several fundamental assertions (source: NIST SP 800-207 draft):

  1. All computing services and data sources are considered as resources.
  2. No implied trust based on network locality.
  3. The network is always considered to be compromised.
  4. The network is under constant internal and external threats.
  5. Authenticate and authorize every user, device, and network flow.
  6. Strict trust verification before accessing each individual resource.
  7. Connection-based restricted access to each individual resource.
  8. Resource access is determined by Identity, behavioral attributes, and dynamic policies.
  9. The organization makes sure that all systems are well-secured.
  10. Monitor systems to make sure that they are well-secured.

What do all these principles mean? Well, this means that there should be no trust between any individual resource and the entity trying to access it. Hence the name Zero Trust Architecture. Implementing it requires leveraging multiple technologies to challenge and prove User trust, Device trust, Session trust, Application trust, and Data trust.

  1. Least-privileged access – The Zero Trust model requires defining privilege limits for users, devices, network flows, and applications. Each user and device should have only the minimum privileges and access rights required to perform their job, on a need-to-know basis (a minimal access-check sketch follows this list).

A comprehensive audit should be done to get a clear picture of the privileges of every entity in the network that needs access. This is a key security control in Zero Trust Architecture, so the access list must always be kept up to date.

  2. Network security policies – All standard network security policies must be in place in addition to Zero Trust policies. They should also be tested regularly for effectiveness and vulnerabilities.
  3. Log and inspect traffic – All activities and traffic must be logged, monitored, and inspected continuously. Automation should be adopted to perform these operations faster and more efficiently.
  4. Risk management and threat detection – A security analytics system must be in place to flag suspicious activities based on monitoring, policies, behavior, and risk-adaptive controls. Proactive threat detection and resolution should be the norm.
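As a small illustration of the least-privileged, need-to-know principle in item 1 above, here is a minimal deny-by-default access check in Python. The roles, resources, and allowlist are hypothetical; a real policy engine would also weigh device posture, session context, and dynamic risk signals.

# Hypothetical least-privilege check; roles and resources are illustrative only.
ACCESS_LIST = {
    "developer":  {"git-repo", "ci-pipeline", "dev-database"},
    "dba":        {"dev-database", "prod-database"},
    "hr-analyst": {"hr-records"},
}

def is_access_allowed(role, resource):
    """Deny by default: grant access only if the role explicitly lists the resource."""
    return resource in ACCESS_LIST.get(role, set())

print(is_access_allowed("developer", "git-repo"))       # True
print(is_access_allowed("developer", "prod-database"))  # False: not on the need-to-know list
print(is_access_allowed("contractor", "git-repo"))      # False: unknown roles get nothing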

A Zero Trust Architecture implementation for network security must address the following:

  1. Micro-segmentation – Divide the network or data center into smaller individual segments that can be secured with different access credentials and policies, in addition to traditional security controls such as VPNs, NAC, and firewalls. This multiplies security by preventing bad actors from going on a malicious spree throughout the network even if they compromise one segment.
  2. Verify users and devices – Users and devices must both undergo a strict authentication and authorization process based on the Eliminated Trust Validation approach. No user or device is trusted or granted access to any resource unless it is on the access list. Both must comply with security protocols, and devices should have up-to-date software, malware and virus protection, current patches, and encryption.
  3. Multi-factor authentication – MFA is far more secure than a password alone; biometrics and one-time passwords (OTPs) are common MFA factors, and support for MFA keeps growing, with almost all smartphones now supporting it.

Multi-factor authentication is an effective way of performing Eliminated trust validation by adding an extra layer of security.
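As one concrete example of an OTP factor, here is a minimal time-based one-time password (TOTP, RFC 6238) verifier in Python using only the standard library. The shared secret is a placeholder, and production systems should rely on a vetted MFA product rather than hand-rolled code.

# Minimal TOTP (RFC 6238) sketch for illustration; use a vetted MFA product in production.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestep=30, digits=6, now=None):
    """Derive the current time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((now if now is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return f"{code:0{digits}d}"

def verify(secret_b32, submitted_code, window=1):
    """Accept codes from the current timestep plus or minus `window` steps of clock drift."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now=now + drift * 30), submitted_code)
               for drift in range(-window, window + 1))

SECRET = "JBSWY3DPEHPK3PXP"  # placeholder base32 secret used in many TOTP examples
print(totp(SECRET), verify(SECRET, totp(SECRET)))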

The Armedia way

At Armedia, we specialize in implementing Zero Trust Architecture for network security and have been continually evolving our Zero Trust security model with the best technologies and policies. In addition to the core components in an enterprise implementing a ZTA, several data sources provide input and policy rules used by the policy engine when making access decisions. These include local data sources as well as external (i.e., non-enterprise-controlled or -created) data sources.

 


These include:

  1. Continuous diagnostics and mitigation (CDM) system: This gathers information about the enterprise asset’s current state and applies updates to configuration and software components. An enterprise CDM system provides the policy engine with the information about the asset making an access request, such as whether it is running the appropriate patched operating system (OS) and applications or whether the asset has any known vulnerabilities.
  • Armedia does not have a centralized policy engine and policy administrator for all resources; this is in part due to the significant effort to centralize all policy making and execution
  • We use Zabbix monitoring and alerting, along with Grafana dashboards for visuals
  • We use OSSEC/Wazuh on Linux for intrusion detection; we abandoned the use of AIDE on Linux due to its resource usage
  • We use Windows Endpoint Security on Windows Servers
  • We use ManageEngine Desktop Central to manage and patch Windows Servers
  • We use Red Hat Satellite to manage and patch Linux Servers
  • Systems and servers are patched and restarted on a scheduled basis
  2. Industry compliance system: This ensures that the enterprise remains compliant with any regulatory regime that it may fall under (e.g., FISMA, healthcare or financial industry information security requirements). This includes all the policy rules that an enterprise develops to ensure compliance.
  • Armedia uses Tenable Nessus with scanning profiles based on DISA SRGs and STIGs
  • Armedia uses Active Directory Group Policy Objects (GPOs) to ensure and reassert compliance within Windows Servers by managing key configuration files, security settings, and application settings based on data stored in Configuration Management
  • Armedia uses Puppet within Red Hat Satellite to ensure and reassert compliance within Linux Servers by managing key configuration files, security settings, and application settings based on data stored in Configuration Management
  3. Threat intelligence feed(s): This provides information from internal or external sources that helps the policy engine make access decisions. These could be multiple services that take data from internal and/or multiple external sources and provide information about newly discovered attacks or vulnerabilities. This also includes blacklists, newly identified malware, and reported attacks on other assets that the policy engine will want to deny access to from enterprise assets.
  • Armedia uses GeoIP data on its perimeter firewalls, VPN, web application, and secure file transfer resources to blacklist regions, countries, and sites that are known threats, along with sites that perform scans or launch attacks
  4. Data access policies: These are the attributes, rules, and policies about access to enterprise resources. This set of rules could be encoded in or dynamically generated by the policy engine. These policies are the starting point for authorizing access to a resource as they provide the basic access privileges for accounts and applications in the enterprise. These policies should be based on the defined mission roles and needs of the organization.
  • Armedia does not have a centralized policy engine and policy administrator for all resources; this is in part due to the significant effort to centralize all policy making and execution
  • Data access policies are defined within the perimeter firewall for permitted inbound and outbound access; policies specifying access are assigned based on Directory group memberships
  • Data access policies are set based on Directory group assignments that align to client, customer, corporate resources for development, test, preproduction, and production environments
  • Data access policies are set within Infrastructure at the network and storage layers
  5. Enterprise public key infrastructure (PKI): This system is responsible for generating and logging certificates issued by the enterprise to resources, subjects, and applications. This also includes the global certificate authority ecosystem and the Federal PKI, which may or may not be integrated with the enterprise PKI. This could also be a PKI that is not built upon X.509 certificates.
  • Directory, Server, Infrastructure, and Network resources trust well-established Third Party Root Certification Authorities

– Federal PKI Root Certification Authority resources not already included can be added upon request and subsequent approval

  • Use Active Directory Certificate Services to manage and issue server, application, and user certificates for all resources in the environment

– Leverage integration with Active Directory for enrollment of Windows Servers, users, and applications

– Leverage automation within Linux Servers for server and application certificate management

  6. ID management system: This is responsible for creating, storing, and managing enterprise user accounts and identity records (e.g., lightweight directory access protocol (LDAP) server). This system contains the necessary user information (e.g., name, email address, certificates) and other enterprise characteristics such as role, access attributes, and assigned assets. This system often utilizes other systems (such as a PKI) for artifacts associated with user accounts. This system may be part of a larger federated community and may include nonenterprise employees or links to nonenterprise assets for collaboration.
  • Use Active Directory for user authentication along with group, container, organizational unit (OU), and organization management

– Desktops and servers (Windows and Linux) are managed using AD; the only exceptions are servers placed in a designated DMZ

  • Use Active Directory for user authorization based on group assignments; groups are defined for application-specific and system access based on the principle of least privilege
  • Use Active Directory Federation Services for federated authentication

– Permit access only for named and authorized users

  • Use Active Directory Certificate Services to manage and issue server, application, and user certificates for all resources in the environment

– Leverage integration with Active Directory for enrollment of Windows Servers, users, and applications

– Leverage automation within Linux servers for server and application certificate management

  7. Network and system activity logs: This is the enterprise system that aggregates asset logs, network traffic, resource access actions, and other events that provide real-time (or near-real-time) feedback on the security posture of enterprise information systems.
  • Logs are collected into an Elastic Stack deployment, have correlation applied, and are then surfaced through Kibana dashboards
  8. Security information and event management (SIEM) system: This collects security-centric information for later analysis. This data is then used to refine policies and warn of possible attacks against enterprise assets.
  • Armedia leverages Splunk and Elastic Stack for deriving and surfacing SIEM information (a minimal sketch of emitting such structured audit events follows this list)
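To make the activity-log and SIEM items above more tangible, here is a minimal sketch that emits access decisions as JSON lines, a format log shippers can forward into an Elastic Stack or Splunk deployment. The event fields are illustrative assumptions, not a required schema.

# Minimal structured audit-event sketch; field names are illustrative, not a SIEM schema.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_logger = logging.getLogger("zta.audit")

def log_access_decision(user, device, resource, allowed, reason):
    """Emit one JSON line per access decision so a shipper can forward it to a SIEM."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "device": device,
        "resource": resource,
        "decision": "allow" if allowed else "deny",
        "reason": reason,
    }
    audit_logger.info(json.dumps(event))

log_access_decision("jdoe", "laptop-042", "prod-database", False, "device missing latest patches")
log_access_decision("jdoe", "laptop-042", "dev-database", True, "role and posture checks passed")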

Establishing User Trust

When someone requests access to the network using any protocol (for example, VPN software, HTTPS, or TLS), we take the following actions based on the use case:

  1. Requests from unauthorized sources (a user or device), such as an unrecognized VPN client, are rejected based on policy.
  2. Once the request source is authenticated, a session is established, and requests with the correct session ID are granted role-based access.
  3. We use multiple protocols and technologies, such as secure LDAP, Kerberos, SAML, TLS over JDBC, ActiveMQ JMS, SSH, and web-based applications.

Confirming State of the Device

  1. VPN access by a client to a server – The necessary information is collected from the client software to determine whether the client device meets a security baseline before access is granted. If the baseline is not met, the VPN server rejects the user without even presenting an authentication challenge (a minimal sketch of such a check follows this list).
  2. Browser with an SSO-aware application – The necessary information is collected from the client’s browser for a security baseline check and to filter out curl/wget requests. LDAP, SAML, Kerberos, and similar protocols are used to grant application access based on the use case.
  3. Developers with SSH connections – We leverage Kerberos for convenient access via the Generic Security Service Application Program Interface (GSSAPI). Multi-factor authentication, encryption, and access-list protocols are also used to grant access depending on the use case.
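Below is a minimal sketch of the kind of device baseline check described in item 1. The posture attributes and thresholds are hypothetical; real deployments collect this information from the VPN client or an endpoint agent rather than a hard-coded dictionary.

# Hypothetical device-posture check before granting VPN access; attributes are illustrative.
REQUIRED_AGENT_VERSION = 5

def meets_baseline(device):
    """Reject before any authentication challenge if the device fails the security baseline."""
    failures = []
    if not device.get("os_patched"):
        failures.append("operating system not patched")
    if not device.get("disk_encrypted"):
        failures.append("disk encryption disabled")
    if not device.get("antivirus_running"):
        failures.append("endpoint protection not running")
    if device.get("agent_version", 0) < REQUIRED_AGENT_VERSION:
        failures.append("endpoint agent out of date")
    return len(failures) == 0, failures

ok, problems = meets_baseline({"os_patched": True, "disk_encrypted": True,
                               "antivirus_running": False, "agent_version": 4})
print("access granted" if ok else "rejected: " + ", ".join(problems))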

Advantages and benefits of Zero Trust Architecture

Our way of implementing Zero Trust Architecture for network security, business processes, services, and systems provides the capabilities needed to deliver the following advantages and benefits (source: NIST 800-207 draft):

  1. Prevention of data and security breaches.
  2. Minimized lateral movement through micro-segmentation.
  3. Security measures and protections that can easily be expanded across multiple layers, regardless of the underlying infrastructure.
  4. Insights and analytics on users, workloads, devices, and components across networks and environments to enforce the required policies.
  5. Continuous logging, monitoring, and reporting to detect threats and respond in a timely manner.
  6. Minimized exposure and improved security compliance.
  7. Fewer security gaps, thanks to coverage of a wide variety of attack points.
  8. Increased business agility to securely adopt cloud solutions.
  9. Less management effort and a smaller skill set required than traditional security architectures.
  10. Savings in time, money, and effort.

On a final note: in today’s digital landscape, organizations need to evolve their security protocols to fight off malicious actors, whether inside or outside the network. Shifting to Zero Trust Architecture will enable you to protect your valuable network and data assets. The Zero Trust model enhances an organization’s security and provides it with substantial business advantages and benefits.