STARS III Press Release

GSA Awards Armedia a Spot on $50B STARS III Contract

VIENNA, Va., June 29, 2021 – Armedia, through its Hartwood-Armedia Joint Venture with Hartwood Consulting Group, was selected by the General Services Administration (GSA) for a spot on its $50 billion STARS III contract. The governmentwide acquisition contract (GWAC) is a best-in-class, easy-to-use vehicle that allows agencies to procure IT services from pre-vetted prime contractors. STARS III is expected to focus on emerging technologies and work outside the continental United States, in addition to a full range of information technology services.

“We are excited to partner with our protégé, Hartwood Consulting Group, on the STARS III GWAC. This gives us another Best in Class vehicle on which we can serve our current and future clients, while continuing to strengthen the relationship with our protégé. The combination of Armedia and Hartwood offers a unique mix of technical excellence and responsive PMO support geared toward both Federal Civilian and DoD agencies,” stated Jan Coley, Vice President of Business Development at Armedia.

Hartwood-Armedia Joint Venture (HAJV)

Hartwood-Armedia Joint Venture (HAJV), an SBA-approved 8(a) and SDVOSB joint venture, combines two companies with complementary capabilities, culture and vision. Our federal clients require a partner who understands the technological landscape and has proven past performance supporting application sustainment and platform operations and maintenance within that environment. Our mission of helping clients transform their operations through data management, process improvement and seamless integration through innovation has proven successful with the USDA, the Department of Defense, the US Senate, the Internal Revenue Service and the US Marine Corps.


About Hartwood

Hartwood Consulting Group, a VA-certified Service-Disabled Veteran-Owned Small Business (SDVOSB) and an 8(a) firm, helps government organizations better understand their services and systems. Hartwood offers an accomplished range of professional and technical capabilities in collaborative support of the government, including all aspects of system augmentation, configuration, assessment and support for RD applications and databases. Hartwood personnel have collectively delivered 100+ rapid application development solutions for USMC HQMC, incorporating tailored Agile, spiral and waterfall methodologies.


About Armedia

Armedia, LLC is a Veteran-Owned Small Business and a long-term solution provider to the US Federal Government. Founded in 2002, Armedia is a technology advisory firm focused on helping organizations build the systems and cultures necessary to manage and harness their data. We provide expertise in process improvement and in supporting that improvement through quality solutions and effective change management. We also provide expertise and services in content intelligence, big data analytics, Agile software development, and open source and mobile technologies. Appraised at CMMI-DEV Maturity Level 3 and ISO 9001:2015 certified, Armedia brings rigorous process discipline to all projects. Armedia also holds a Top Secret facility clearance.


Future-proofing organizations with open source software solutions


Procurement practices built around vendor lock-in can undermine both the flexibility and the cost-efficiency of IT modernization initiatives, slowing government agencies’ ability to face new challenges. Fueled by misconceptions, open-source solutions used to be overlooked as viable alternatives to proprietary software within large enterprises. With the successes of Linux, Java, Apache, ArkCase, Drupal, WordPress, Alfresco, Nuxeo, and many others, open-source technologies have proven to be secure, cost-effective and scalable alternatives for enterprises looking to modernize their systems.

The agency conundrum: Do more and spend less

The growing workload of many agencies rests on outdated, proprietary IT systems that are not flexible enough to respond to new opportunities and challenges. Whether it’s functionality limitations, compute limitations, or the inability to harden against cyber threats, the solutions of yesteryear are the challenges of today that prevent agencies from future-proofing.
To add to the difficulty, cost is always a key ingredient in any agency procurement process. Since public funds mean public responsibility, agencies face the challenge of maximizing their IT investments.


The budgetary weight of vendor lock-in fueled by open source myths

There may be multiple valid reasons why an organization would select proprietary software over open-source software, but one of the most persistent myths surrounding open-source software concerns security. The argument for increased vulnerabilities in open-source software may have had some merit for solutions that are in early stages or no longer supported by the community. However, for actively supported software, security vulnerabilities are quickly identified and fixed, precisely because the source code is public and anyone can contribute. In this regard, a community of developers can grow and strengthen open source software much faster than a company with limited resources.

Thanks to wide user bases and usually large support communities, open source solutions tend to grow faster. This growth can be in:

  • Versatility, by attracting more solution providers who build their add-ons for specific or core functionality improvements.
  • Code quality, thanks to different contributors collaborating on code sections, leveraging years of experience by developers from different industries and walks of life.
  • Security, because of vigilant peer review surrounding new updates, new features, code refactoring etc.

This is not to say that open source solutions are by default the better choice in every use case. Organizations will still want to do their due diligence in knowing who the contributors and developers of the software are, what was done with the code, how the software has evolved, and how quickly the community responds.

Open source is the new normal

In 2016, a White House initiative to promote open source (the Federal Source Code Policy, which produced Code.gov) resulted in a growing repository of code developed by or for government agencies. This formidable index of fully open-source projects is an excellent resource for developing cost-effective solutions.

The private sector has not been idle either. Large companies are becoming leaders in open source projects. Open source solutions offer agencies reliable platforms that allow for interconnectivity and compatibility with new AI/ML solutions while making those business solutions more streamlined, says Christine Cox, regional vice president of federal sales for SUSE.

Acceptance of open source has grown significantly worldwide, and most governments have embraced open-source initiatives.


Open source brings connectivity, flexibility and cost-efficiency

Open source technologies give agencies increased interconnectivity, moving data securely and reliably across multiple cloud environments. This helps agencies automate repetitive data processing and management tasks, streamline business processes, and cut costs at the same time.

By using open source front and center, agencies benefit on several levels:

  • Lower IT expenses: because any company can provide upgrade and maintenance services for software built on an open source base, agencies benefit from competitive bidding and a lower total cost of ownership and maintenance than with proprietary solutions.
  • Lower ownership risk: unlike proprietary software, where an agency depends entirely on the current vendor, open source mitigates this risk. If an agency is not pleased with the original vendor’s support, it can choose a different vendor to support the same open source solution.
  • No user count constraints: while most proprietary solution providers impose contractual limits on the number of users, open-source solution providers normally do not. Agencies can open up access to the solution as needed without facing additional licensing costs.
  • Mutualization: by using open source solutions, agencies can leverage the large repository of components and solutions already developed for other agencies. This standardization of features helps all agencies get better solutions without duplicating development work.
  • Agile security: proprietary code is owned and accessed only by the company that built it, so if hackers find a vulnerability, agencies depend on that vendor’s capacity to detect, identify and remedy it. With open source software, this limitation disappears: agencies facing a vulnerability the vendor cannot remedy are free to bring in other subject matter experts and resolve the problem faster.

These are just several of the benefits that agencies gain by using open source software. Tapping into the large repository of already built software used by other agencies, combined with the flexibility of companies that provide support and maintenance, puts agencies in a liberating position, both contractually and budget-wise.

How will your organization benefit from open source solutions?

Ultimately, all this translates into better service for taxpayers, thanks to superior software solutions maintained by agile companies that won their bids on quality, reliability and cost.

If your organization is considering an IT modernization initiative, we’d love to hear from you. Our staff is ready to hop on a no-obligation call to discuss your business needs and the solutions that can capitalize on these opportunities.

ZyLab + ArkCase + eDiscovery: A Privacy Management Solution to Solve CCPA/GDPR Challenges

ZyLab + ArkCase + eDiscovery: A Privacy Management Solution to Solve CCPA/GDPR Challenges

ZyLab ArkCase eDiscovery A Privacy Management Solution to Solve CCPA/GDPR Challenges

With the emergence of data privacy laws in the USA and the European Union, companies are facing an uphill battle. The legal framework around Data Subject Access Requests (DSARs) imposes strict rules on processing and responding to requests. Failing to respond to a DSAR can mean serious financial penalties.

Under the European Union’s General Data Protection Regulation (GDPR) and its closest US counterpart, the California Consumer Privacy Act (CCPA), non-compliance can be costly: GDPR fines can reach 20 million euros or 4% of annual global revenue, whichever is higher. Under the CCPA, regulators can levy civil penalties per violation, and consumers retain a private right of action for data breaches.
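For illustration, the GDPR’s upper fine tier (the higher of a fixed 20-million-euro floor or 4% of annual global revenue) reduces to a one-line calculation. The function below is just a restatement of that rule, not legal advice:

```python
def gdpr_max_fine(annual_global_revenue_eur: float) -> float:
    """Upper-tier GDPR fine: the higher of EUR 20 million
    or 4% of annual global revenue (Art. 83(5))."""
    return max(20_000_000.0, 0.04 * annual_global_revenue_eur)

# At EUR 300M revenue, 4% is only EUR 12M, so the EUR 20M floor applies;
# at EUR 1B revenue, the 4% tier (EUR 40M) applies instead.
print(gdpr_max_fine(300_000_000))    # 20000000.0
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
```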

Despite such large fines, according to Egress, only 30% of business respondents are in regulatory compliance, and only another 27% plan to achieve it in 2020. A recent ZyLAB survey points out that for 45.4% of those implementing a DSAR solution, the biggest challenge is remaining compliant in the future.

In this article, we will cover DSAR compliance and propose several off-the-shelf technologies that can provide a reliable, scalable and, most importantly, cost-effective DSAR solution.

The greatest DSAR compliance challenge 

We live in a time of corporate data explosion. Digitally connected, we are creating a digital trail wherever we go, whatever we do. Our civilization’s digital footprint doubles every 18 months.

With the introduction of DSARs, data has gone from an undisputed source of wealth and business value to a regulatory problem that requires fundamental changes in organizational behavior.

GDPR and CCPA create a whole set of new obligations that organizations can’t ignore.

  1. Citizens can use the “Right of Access” (GDPR) or “Right to Know” (CCPA) to ask whether an organization possesses personal data about them.
  2. Citizens can use the “Right to be Forgotten” (GDPR) or “Right to Delete” (CCPA) to demand deletion of all their personal data that the organization possesses. Data controllers are obligated to comply “without undue delay,” which means within a one-month time frame.
  3. Organizations should follow specific data management regulations, such as:
    • Strict cybersecurity requirements (mandatory data encryption, data security measures, reporting of breaches, etc.).
    • Data processing rules.
    • Redacting or pseudonymizing all sensitive information when there is no regulatory need to collect, possess, manage, or use it.
  4. They must ask the user for prior consent before the user’s personal information is collected and stored.
  5. All data breaches must be reported, and all subjects whose data has been breached must be informed. Organizations have 72 hours from discovery to notify authorities and must keep records of the incident. Data subjects must also be notified “without undue delay” when breaches affect their unencrypted personal data. The CCPA additionally allows individual or class actions against organizations that fail to adopt reasonable security practices to prevent data breaches.
  6. All third-party integrations must be in regulatory compliance, and the organization should be able to demonstrate that.

As we see, regulatory compliance is not an option. It is an obligation.

From the DSAR perspective, organizations need a scalable system to receive these requests, process them quickly and respond on time. However, with a growing data footprint spread across disconnected sources such as email, chat systems and physical correspondence, searching for each requestor’s data is a daunting challenge. Luckily, there are advanced search solutions that can handle this kind of workload.

ZyLAB’s eDiscovery: The silver bullet on the data train?



At its core, eDiscovery is the collection, processing, and indexing of disparate content so that it can be thoroughly reviewed and redacted. Heavily used in the legal sector, eDiscovery enables organizations to scour large sources of data for Personally Identifiable Information (PII) lightning fast.

When an organization receives a DSAR, the challenge lies in tracking all data sources for specific details, usually personally identifiable information and other data about the requestor and the data holder. These requests can be simple to process, but as organizations grow, so does the complexity of these requests.

Two organizations have reported staggering costs for processing complex SARs. The first is the Nursing and Midwifery Council in the UK, where a single, heavily redacted DSAR cost about $315,000 in processing costs and legal fees. In another case, Oxford University spent $150,000 responding to a single SAR from Dr. Cécile Deer because it had to process over half a million emails.

Without a software solution that can process digital data and find personally identifiable information of the requestor (all while masking other individuals’ PII), responding to SAR requests can be extremely expensive for organizations.

An eDiscovery solution like ZyLAB ONE therefore comes close to a silver bullet for DSAR challenges. The ability to find any document across various locations, fast and at scale, is essential for a reliable DSAR solution.

Without using eDiscovery, any DSAR solution would struggle with the search functionality, which is essential for timely DSAR processing.

eDiscovery is responsible for:

  • Locating and processing all relevant data across all repositories, e-mails, etc.
  • Redacting all personally identifiable information related to other individuals mentioned in the same content.
  • Collecting information directly from the relevant organization’s sources with true data integrity.
  • De-duplicating the information. With any DSAR search, the portion of duplicate documents can be up to 80%. Deduplication eliminates a huge portion of work, therefore speeding up the DSAR response time.
  • Automatically unpacking containers of files and making every component searchable.
  • Enriching non-searchable data such as scans, images, media files or unsearchable PDFs, so that all information can be searched and used.
  • Analyzing, classifying and organizing information for a quick and comprehensive review.
  • Using auto-redaction to anonymize or pseudonymize personal and confidential information. This is crucial for data transfer outside of the EU. Anonymization is the more robust option: after redaction, the data subject can no longer be identified at all. With pseudonymization, data can no longer be attributed to a data subject without additional identifying information, which is kept separately and protected by technical and organizational security measures. Once the identifiers are reunited with the core data, it must be safeguarded like any other personal data.
  • Automatically converting all electronic file formats to one standard format before redactions.
  • Detailed tracking and reporting provide a complete audit trail to prove requested personal data erasure.

These are the key features to solve the GDPR/CCPA bottleneck for finding any document across all locations in the organization. It’s important to note that not all eDiscovery solutions have all these features incorporated.
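To make two of these features concrete, here is a minimal sketch of hash-based de-duplication and pseudonymization. This is an illustrative example, not ZyLAB’s implementation; the email-only pattern and the `subject-` token format are assumptions, and production PII detection covers far more than email addresses:

```python
import hashlib
import re

def deduplicate(documents):
    """Drop exact duplicates by hashing normalized content; content
    hashing is the usual basis for eDiscovery de-duplication."""
    seen, unique = set(), []
    for doc in documents:
        digest = hashlib.sha256(doc.strip().lower().encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")  # simplistic, for illustration only

def pseudonymize(text, key_store):
    """Replace each email address with a stable pseudonym, keeping the
    mapping in a separate key store as pseudonymization requires."""
    def _sub(match):
        token = "subject-" + hashlib.sha256(match.group().encode()).hexdigest()[:8]
        key_store[token] = match.group()  # stored separately, under its own safeguards
        return token
    return EMAIL.sub(_sub, text)
```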

ZyLAB ONE’s AI-powered eDiscovery combines advanced search, text-mining, auto-classification, natural language processing (NLP) and machine learning. Using these procedures, ZyLAB ONE can cull information from archives to ascertain what information can be destroyed without harming the business, historical or legal need for that data.

ZyLAB ONE eDiscovery can scale out and manage search over large clusters of machines. Both indexing and searching can be distributed over as many machines as desired, and indexes can be centralized or distributed for better performance or robustness needs.

This results in almost unlimited scalability of the search engine. Depending on the hardware, ZyLAB can index multiple terabytes of data in a matter of hours, while remaining fast even for large queries containing positional operators, Boolean operators, quorum search, wildcards, fuzzy matching (including at the beginning of words) and complex regular expressions, with flexible parsing and tokenization.

Support functions such as index checksums, index status monitoring tools, an environment status tool, and current running status help with control and maintenance.

Having a powerful eDiscovery component alone, however, isn’t enough for a solid DSAR solution. Organizations need to marry this search capability with a system that can capture DSAR requests and use workflows and automation to process them at scale.

DSAR Management with the ArkCase Case Management Platform



One hallmark of DSAR request management, other than being data-intensive, is that it has a relatively fixed workflow:

  • It all starts with a public portal where people can fill out a DSAR request.
  • Next, there is a mandatory identity verification to confirm that the requester is the data subject. Then, the request is queued for processing.
  • The processing has a fixed workflow of finding the data, deduplication, redaction, review, and delivery.
  • The data subject can follow up with a deletion request, to which the company responds with proof of deletion beyond recovery.
  • Lastly, the entire process from submission to closure should be auditable, meaning that at every stage of the workflow, log entries are recorded and stored securely.
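Because the workflow is fixed, a case management system can enforce it as a transition table with an audit trail. The sketch below is a hypothetical model of such a workflow, not ArkCase’s actual data model; the state names are illustrative:

```python
# Allowed transitions for a DSAR case, mirroring the workflow above.
DSAR_WORKFLOW = {
    "submitted":             ["identity_verification"],
    "identity_verification": ["queued", "rejected"],
    "queued":                ["processing"],
    "processing":            ["review"],       # find, deduplicate, redact
    "review":                ["delivered"],
    "delivered":             ["deletion_requested", "closed"],
    "deletion_requested":    ["closed"],       # closed with proof of deletion
}

class DSARCase:
    def __init__(self, requester):
        self.requester = requester
        self.state = "submitted"
        self.audit_log = []                    # every transition is recorded

    def advance(self, new_state):
        """Refuse any transition the workflow does not allow, so the
        audit log is a complete, valid history from submission to closure."""
        if new_state not in DSAR_WORKFLOW.get(self.state, []):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.audit_log.append((self.state, new_state))
        self.state = new_state
```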

Software solutions that are preconfigured with workflows and forms are also heavily used in medical and legal practices. These case management solutions enable organizations to automate repetitive tasks, streamline workflows, leverage collaboration and use the cloud for global yet secure access.

One of our favorite case management platforms is ArkCase. It is a modern, open-source case management platform that accelerates case processing time. Thanks to its flexibility, ArkCase offers many off-the-shelf solutions such as data privacy management, FOIA requests management, complaint management, correspondence management, legal case management, etc.

ArkCase is a robust platform that comes with a personalized dashboard, document management, collaboration, a rules engine, configurable and pre-configured workflows, advanced search, reporting, calendaring, task management and multimedia search, and it is fully auditable. It is an open-source solution that is field-tested, cost-effective and future-proof.

ArkCase also integrates with complementary technologies, including:

  • Content/Records Management System
  • Robotic Process Automation (RPA)
  • Analytics
  • Correspondence Management
  • Modern eDiscovery

With these integrations, the ArkCase DSAR Solution claims a processing time savings of 60%. Without such a solution, manually processing privacy requests can cost $1,400 per request.

With flexible licensing and pricing, ArkCase DSAR can be an excellent way to achieve full CCPA and GDPR compliance without breaking the budget.

Wrap-Up: The Combined Benefits Of ZyLAB ONE And ArkCase 


As a result of legal frameworks such as GDPR and CCPA, companies face an ever-growing number of data disclosure requests in the form of DSARs. Companies that gather and store large volumes of user data will find it difficult to respond to these requests on time, even if all their data is digitally stored.

Finding all personally identifiable information related to the requestor, while redacting all other PII from other individuals mentioned in the requested documents, is a daunting task that cannot be solved with increasing the workforce alone. Therefore, organizations turn to scalable technologies like ZyLAB ONE and ArkCase. ZyLAB ONE provides a reliable and fast eDiscovery search, while ArkCase enables people to work optimally, one case at a time.

ZyLAB ONE eDiscovery has the most scalable and flexible architecture on the market. ZyLAB ONE easily handles large data volumes. The total system capacity can be scaled up by assigning as many virtual machines as needed to increase the computing capacity. As a SaaS-based eDiscovery solution, it is suitable for thin-client and remote work use, but it can also be implemented on-premise or hybrid.

ZyLAB ONE eDiscovery provides seamless integration for an efficient process without interruption. A flexible architecture “follows the wave of data” through the eDiscovery system during a project. Thanks to this flexible architecture, ZyLAB ONE provides a future-proof solution for processing large amounts of data, ensuring reliable eDiscovery functionality.

ArkCase is a FedRAMP Moderate open-source platform with a proven track record. As a cloud DSAR solution, it is suitable for thin-client and remote work use, but it can also be deployed on-premise or hybrid. Its federated search allows the simultaneous search of multiple searchable resources.
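Conceptually, federated search fans a query out to several repositories in parallel and merges the scored results. The sketch below illustrates the idea under simplified assumptions (each backend is a callable returning scored document IDs); it is not ArkCase’s implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def federated_search(query, backends):
    """Query several searchable resources in parallel and merge the
    results, de-duplicating by document id. `backends` is a list of
    callables, each taking a query string and returning (id, score) pairs."""
    merged = {}
    with ThreadPoolExecutor(max_workers=len(backends)) as pool:
        for results in pool.map(lambda backend: backend(query), backends):
            for doc_id, score in results:
                # keep the best score when repositories overlap
                merged[doc_id] = max(score, merged.get(doc_id, 0.0))
    return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)
```

With a mail archive and a file share as backends, a single call returns one ranked, de-duplicated result list instead of two separate ones.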

Combined, the two provide a scalable DSAR solution: an organized, central location where all data is standardized and where all your Production Readiness Review (PRR) processes begin and end. The platform gives control over data access, document review and redaction, and lets you check the status of a request at any point in time. All actions are documented and traceable, reducing litigation risk.

If you’re interested in finding out more details about how Armedia can help as a Solutions Integrator and solve your Data Privacy Management needs, contact us for a no-obligation consultation.


Implementing Zero Trust Architecture With Armedia



Traditional network security strategies aim to keep malicious actors out of the network while allowing almost unrestricted access to users and devices inside it. These architectures rely on legacy technologies such as firewalls, virtual private networks (VPNs) and network access control (NAC) to build multiple security layers at the perimeter. In effect, they trust users inside the infrastructure and verify only those outside. If attackers gain access to the internal network, they can reach the entire infrastructure.


Let’s look at a typical attack progression to see why this is not enough:

  1. A phishing email attack targets employees.
  2. A privileged machine is compromised.
  3. A keylogger is installed on corporate machines.
  4. A developer password is compromised.
  5. The production environment is compromised via the privileged machine.
  6. Database credentials are compromised.
  7. Data is exfiltrated via a compromised cloud hosting service, and so on.

These are the threats that come from within the network. How do you plan to prevent such scenarios?

The answer is Zero Trust Architecture.

What is a Zero Trust Architecture?

zero trust architecture

“Zero trust” is a term that was coined by John Kindervag in 2010. He proposed that companies should move away from perimeter-centric network security approaches to a model that involves continuous trust verification mechanisms across every user, device, layer, and application.


ZTA takes the “Never trust, always verify” approach, implementing strict identity verification for users and devices whether they access resources from inside the network perimeter or outside.

Once a user or device is inside the network, Zero Trust Architecture enforces limited access to contain malicious activity in case the entity turns out to be an attacker. Thus, if a security breach happens, it cannot propagate to the whole infrastructure as it would in traditional network security architectures.

ZTA assumes the network is already compromised, so every user and device must pass strict identity verification to prove they are not malicious actors. This model treats every actor as external and continuously challenges them to verify trust. Once verified, only the required access is granted.

How to implement a Zero Trust Architecture


Zero Trust Architecture is not a technology, nor is it tied to any specific technology. It is a holistic strategy and approach to network security based on several fundamental assertions (source: NIST SP 800-207):

  1. All computing services and data sources are considered as resources.
  2. No implied trust based on network locality.
  3. The network is always considered to be compromised.
  4. The network is under constant internal and external threats.
  5. Authenticate and authorize every user, device, and network flow.
  6. Strict trust verification before accessing each individual resource.
  7. Connection-based restricted access to each individual resource.
  8. Resource access is determined by Identity, behavioral attributes, and dynamic policies.
  9. The organization makes sure that all systems are well-secured.
  10. Monitor systems to make sure that they are well-secured.

What do all these principles mean? Simply put, there should be no implicit trust between any individual resource and the entity trying to access it; hence the name Zero Trust Architecture. Implementing it requires leveraging multiple technologies to challenge and prove user trust, device trust, session trust, application trust, and data trust.

  1. Least-privileged access: the zero trust model requires defining privilege limits for users, devices, network flows, and applications. Each user and device should have the minimum privileges and access rights required to perform their jobs, on a need-to-know basis.

A comprehensive audit should be done to get a clear picture of privileges for every entity in the network who needs access. This is a key security control in Zero Trust Architecture so the access-list must always be up-to-date.

  2. Network security policies: all standard network security policies must be in place in addition to zero trust policies. They should also be tested regularly for effectiveness and vulnerabilities.
  3. Log and inspect traffic: all activities and traffic must be logged, monitored, and inspected continuously. Automation should be adopted to perform these operations faster and more efficiently.
  4. Risk management and threat detection: a security analytics system must be in place to flag suspicious activities based on monitoring, policies, behavior, and risk-adaptive controls. Proactive threat detection and resolution should be the norm.
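To illustrate the default-deny, least-privilege principles above, here is a toy policy check. A real ZTA policy engine (see NIST SP 800-207) also weighs behavioral attributes, threat intelligence and dynamic policies; the roles, resources and device attributes below are purely illustrative:

```python
# Toy zero-trust policy check: access is denied unless an explicit entry
# on the access list grants this role this action on this resource, and
# the requesting device passes a posture check. Default is deny.
POLICIES = {
    # (role, resource, action) tuples form the access list
    ("developer", "git-repo",  "read"),
    ("developer", "git-repo",  "write"),
    ("auditor",   "audit-log", "read"),
}

def device_compliant(device):
    """Minimal posture check: patched OS and disk encryption required."""
    return bool(device.get("patched")) and bool(device.get("encrypted"))

def authorize(user, device, resource, action):
    """Every request is evaluated on its own; network location is never
    consulted, so being 'inside' the perimeter grants nothing."""
    if not device_compliant(device):
        return False
    return (user["role"], resource, action) in POLICIES
```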

A Zero Trust Architecture implementation for network security must address the following:

  1. Micro-segmentation: divide the network or data center into smaller individual parts that can be secured with different access credentials and policies, in addition to traditional security controls such as VPNs, NAC and firewalls. This multiplies security by preventing bad actors from going on a malicious spree throughout the network even if they compromise one part.
  2. Verify users and devices: users and devices must both undergo a strict authentication and authorization process based on the Eliminated Trust Validation approach. No user or device can be trusted or granted access to any resource unless it is on the access list. Both must comply with security protocols, and devices should have up-to-date software, malware and virus protection, current patches, and encryption.
  3. Multi-factor authentication: MFA is far more secure than a password alone; biometrics and one-time passwords (OTPs), for example, are common MFA factors. The number of devices supporting MFA grows daily, and almost all smartphones support it.

Multi-factor authentication is an effective way of performing Eliminated trust validation by adding an extra layer of security.
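One widely deployed OTP factor is the time-based one-time password (TOTP) defined in RFC 6238, which most authenticator apps implement. A minimal sketch using only the Python standard library:

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, timestamp=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter,
    dynamically truncated to a short decimal code."""
    now = time.time() if timestamp is None else timestamp
    counter = int(now // step)                       # 30-second time steps
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890", T = 59 seconds
print(totp(b"12345678901234567890", timestamp=59, digits=8))  # 94287082
```

Server and authenticator app share the secret and compute the same short-lived code independently, so intercepting one code is of little use to an attacker.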

The Armedia way

At Armedia, we specialize in implementing Zero Trust Architecture for network security, and we have been constantly evolving our zero trust security model with the best technologies and policies. In addition to the core components of an enterprise ZTA, several data sources provide input and policy rules that the policy engine uses when making access decisions. These include local data sources as well as external (i.e., non-enterprise-controlled or -created) data sources.


Armedia Zero Trust Architecture

These include:

  1. Continuous diagnostics and mitigation (CDM) system: This gathers information about the enterprise asset’s current state and applies updates to configuration and software components. An enterprise CDM system provides the policy engine with the information about the asset making an access request, such as whether it is running the appropriate patched operating system (OS) and applications or whether the asset has any known vulnerabilities.
  • Armedia does not have a centralized policy engine and policy administrator for all resources; this is in part due to the significant effort required to centralize all policy making and execution
  • We use Zabbix for monitoring and alerting, along with Grafana dashboards for visuals
  • We use OSSEC/Wazuh on Linux for intrusion detection; we abandoned the use of AIDE on Linux due to its resource usage
  • We use Windows Endpoint Security on Windows Servers
  • We use ManageEngine Desktop Central to manage and patch Windows Servers
  • We use Red Hat Satellite to manage and patch Linux Servers
  • Systems and servers are patched and restarted on a scheduled basis
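To illustrate how a CDM feed can inform an access decision, here is a minimal sketch in Python. The asset attributes and the 30-day patch threshold are invented for the example, not Armedia's actual policy:

```python
from dataclasses import dataclass, field

@dataclass
class AssetState:
    """Hypothetical snapshot of a device as reported by a CDM system."""
    hostname: str
    patch_age_days: int          # days since the OS was last patched
    known_vulns: list = field(default_factory=list)

def cdm_allows(asset, max_patch_age_days=30):
    """Policy-engine check: deny assets with known vulnerabilities or stale patches."""
    if asset.known_vulns:
        return False
    return asset.patch_age_days <= max_patch_age_days
```

A real policy engine would combine a signal like this with identity and context before granting access.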
  2. Industry compliance system: This ensures that the enterprise remains compliant with any regulatory regime that it may fall under (e.g., FISMA, healthcare or financial industry information security requirements). This includes all the policy rules that an enterprise develops to ensure compliance.
  • Armedia uses Tenable Nessus with scanning profiles based on DISA SRGs and STIGs
  • Armedia uses Active Directory Group Policy Objects (GPOs) to ensure and reassert compliance within Windows Servers by managing key configuration files, security settings, and application settings based on data stored in Configuration Management
  • Armedia uses Puppet within Red Hat Satellite to ensure and reassert compliance within Linux Servers by managing key configuration files, security settings, and application settings based on data stored in Configuration Management
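The "ensure and reassert" pattern used by GPOs and Puppet reduces to comparing actual settings against a desired baseline and correcting any drift. A toy sketch follows; the setting names are illustrative, not a real GPO or Puppet manifest:

```python
# Illustrative desired state; a real baseline would come from Configuration Management.
BASELINE = {
    "PasswordComplexity": "enabled",
    "SMBv1": "disabled",
    "AuditLogging": "enabled",
}

def reassert(actual):
    """Find settings that drifted from the baseline and converge them back."""
    drift = {k: v for k, v in BASELINE.items() if actual.get(k) != v}
    actual.update(drift)
    return drift
```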
  3. Threat intelligence feed(s): This provides information from internal or external sources that help the policy engine make access decisions. These could be multiple services that take data from internal and/or multiple external sources and provide information about newly discovered attacks or vulnerabilities. This also includes blacklists, newly identified malware, and reported attacks to other assets that the policy engine will want to deny access to from enterprise assets.
  • Armedia uses GeoIP data on its perimeter firewalls, VPN, web application, and secure file transfer resources to blacklist regions, countries, and sites that are known threats, along with sites that perform scans or launch attacks
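Conceptually, a GeoIP-based blacklist is a two-stage filter: drop known-bad addresses first, then drop anything resolving to a blocked region. The lookup table and addresses below are invented (RFC 5737 documentation ranges), since a real firewall queries a GeoIP database:

```python
GEOIP = {"203.0.113.9": "XX", "198.51.100.7": "US"}   # hypothetical IP -> region
BLOCKED_REGIONS = {"XX"}        # regions flagged as known threats
BLOCKED_IPS = {"192.0.2.66"}    # individual sites seen scanning or attacking

def perimeter_allows(src_ip):
    """Perimeter decision: specific blacklisted IPs first, then blocked regions."""
    if src_ip in BLOCKED_IPS:
        return False
    return GEOIP.get(src_ip) not in BLOCKED_REGIONS
```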
  4. Data access policies: These are the attributes, rules, and policies about access to enterprise resources. This set of rules could be encoded in or dynamically generated by the policy engine. These policies are the starting point for authorizing access to a resource as they provide the basic access privileges for accounts and applications in the enterprise. These policies should be based on the defined mission roles and needs of the organization.
  • Armedia does not have a centralized policy engine and policy administrator for all resources; this is in part due to the significant effort to centralize all policy making and execution
  • Data access policies are defined within the perimeter firewall for permitted inbound and outbound access; policies specifying access are assigned based on Directory group memberships
  • Data access policies are set based on Directory group assignments that align to client, customer, corporate resources for development, test, preproduction, and production environments
  • Data access policies are set within Infrastructure at the network and storage layers
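The group-based policies described above amount to an intersection test between a user's Directory groups and the groups an environment admits. A minimal sketch with invented users, groups, and environments:

```python
# Hypothetical Directory group assignments and per-environment access policies.
GROUPS = {"adeel": {"dev-team"}, "jan": {"dev-team", "prod-ops"}}
POLICY = {"development": {"dev-team"}, "production": {"prod-ops"}}

def may_access(user, environment):
    """Grant access only when the user's groups intersect the environment's policy."""
    return bool(GROUPS.get(user, set()) & POLICY.get(environment, set()))
```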
  5. Enterprise public key infrastructure (PKI): This system is responsible for generating and logging certificates issued by the enterprise to resources, subjects, and applications. This also includes the global certificate authority ecosystem and the Federal PKI, which may or may not be integrated with the enterprise PKI. This could also be a PKI that is not built upon X.509 certificates.
  • Directory, Server, Infrastructure, and Network resources trust well-established Third Party Root Certification Authorities

– Federal PKI Root Certification Authorities not included by default can be added upon request and subsequent approval

  • Use Active Directory Certificate Services to manage and issue server, application, and user certificates for all resources in the environment

– Leverage integration with Active Directory for enrollment of Windows Servers, users, and applications

– Leverage automation within Linux Servers for server and application certificate management

  6. ID management system: This is responsible for creating, storing, and managing enterprise user accounts and identity records (e.g., lightweight directory access protocol (LDAP) server). This system contains the necessary user information (e.g., name, email address, certificates) and other enterprise characteristics such as role, access attributes, and assigned assets. This system often utilizes other systems (such as a PKI) for artifacts associated with user accounts. This system may be part of a larger federated community and may include nonenterprise employees or links to nonenterprise assets for collaboration.
  • Use Active Directory for user authentication along with group, container, organizational unit (OU), and organization management

– Desktops and servers (Windows and Linux) are managed using AD; the only exceptions are servers placed in a designated DMZ

  • Use Active Directory for user authorization based on group assignments; groups are defined for application-specific and system access based on the principle of least privilege
  • Use Active Directory Federation Services for federated authentication

– Permit access only for named and authorized users


  7. Network and system activity logs: This is the enterprise system that aggregates asset logs, network traffic, resource access actions, and other events that provide real-time (or near-real-time) feedback on the security posture of enterprise information systems.
  • Logs are collected into an Elastic Stack deployment, have correlation applied, and are then surfaced through Kibana dashboards
  8. Security information and event management (SIEM) system: This collects security-centric information for later analysis. This data is then used to refine policies and warn of possible attacks against enterprise assets.
  • Armedia leverages Splunk and Elastic Stack for deriving and surfacing SIEM information

Establishing User Trust

When someone requests access to the network with any protocol (for example, VPN software, HTTPS, or TLS), we take the following actions based on the use case:

  1. Requests from unauthorized sources (a user or device), such as an invalid VPN connection, are rejected based on policy.
  2. Once the request source is authenticated, a session is established, and requests with the correct session ID are granted role-based access.
  3. We use multiple protocols and technologies, such as secure LDAP, Kerberos, SAML, TLS over JDBC, ActiveMQ JMS, SSH, and web-based applications.
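The steps above can be sketched as a single request-handling flow. The in-memory credential store, session table, and role map are all invented for illustration; in practice these live in a directory service and an identity provider:

```python
import secrets

CREDENTIALS = {"alice": "correct-horse"}   # illustrative only; never store plaintext
ROLES = {"alice": "developer"}
SESSIONS = {}                              # session ID -> authenticated user

def authenticate(user, password):
    """Steps 1-2: reject unauthorized sources, else establish a session."""
    if CREDENTIALS.get(user) != password:
        return None                        # unauthorized source: rejected by policy
    session_id = secrets.token_hex(16)
    SESSIONS[session_id] = user
    return session_id

def authorize(session_id):
    """Grant role-based access only for requests carrying a valid session ID."""
    user = SESSIONS.get(session_id)
    return ROLES.get(user) if user else None
```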

Confirming State of the Device

  1. VPN access by a client to a server – Necessary information is collected from the client software to determine whether the client device meets a security baseline. If the baseline is not met, the VPN server rejects the user without even presenting an authentication challenge.
  2. Browser with an SSO-aware application – Necessary information is collected from the client’s browser for a security baseline check and for filtering out curl/wget requests. LDAP, SAML, Kerberos, and similar protocols are used to grant application access based on the use case.
  3. Developers with SSH connections – We leverage Kerberos and the Generic Security Services Application Program Interface (GSSAPI) for easy access. Multi-factor authentication, encryption, and access lists are also used to grant access depending on the use case.
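The VPN case in step 1 can be sketched as a posture gate that runs before any authentication challenge is presented; the required posture attributes below are hypothetical examples of a security baseline:

```python
# Hypothetical security baseline a connecting device must meet.
REQUIRED = {"disk_encrypted": True, "av_running": True, "os_supported": True}

def baseline_met(posture):
    """True only if every required posture attribute checks out."""
    return all(posture.get(k) == v for k, v in REQUIRED.items())

def handle_vpn_connect(posture):
    # Devices failing the baseline are rejected before any auth challenge.
    return "present-auth-challenge" if baseline_met(posture) else "reject"
```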

Advantages and benefits of Zero Trust Architecture

Our way of implementing the Zero Trust Architecture for network security, business processes, services, and systems provides much-needed capabilities that enable the following advantages and benefits (source: NIST SP 800-207 draft, pages 19 and 20):

  1. Prevention of data and security breaches.
  2. Minimized lateral movement through micro-segmentation.
  3. Security measures and protection that expand easily across multiple layers, regardless of the underlying infrastructure.
  4. Insights and analytics for users, workloads, devices, and components across networks and environments to enforce the required policies.
  5. Continuous logging, monitoring, and reporting to detect threats and respond in a timely manner.
  6. Minimized exposure and increased security compliance.
  7. Fewer security gaps through coverage of a wide variety of attack points.
  8. Increased business agility to securely adopt cloud solutions.
  9. Lower management and skill-set requirements than traditional security architectures.
  10. Savings in time, money, and effort.

On a final note: in today’s digital landscape, organizations need to evolve their security protocols to fight off malicious actors, whether inside or outside the network. Shifting to Zero Trust Architecture will enable you to protect your valuable network and data assets. The zero trust model enhances the security of an organization and provides it with substantial business advantages and benefits.

Enterprise & SaaS: Solving the Productivity Problem During COVID-19


The COVID-19 pandemic forced enterprises into a full work-from-home mode practically overnight. There was no time to test, evaluate and decide. We were thrown into this “new normal” where anything that can be done remotely is being done remotely.

“We’re being forced into the world’s largest work-from-home experiment and, so far, it hasn’t been easy for a lot of organizations to implement,” says Saikat Chatterjee, Senior Director at Gartner.

This “new normal” is testing every enterprise’s agility to its core. Two questions are key for every enterprise:

  • Can we continue working?
  • How do we ensure continuity of operations during the pandemic?

The stakes are high as enterprises struggle to adapt and minimize any losses in money, reputation, clients, employees, etc.

With significant growth of interest in remote work, enterprises are scrambling to find ways to adapt their existing legacy systems to enable their workforce to work from home. The chatter around work from home is higher than at any previous time, as the chart below shows. This study points out that “work from home” was a subject of 423 transcribed conversations in public companies:


work from home

Technology plays the leading role in this digital transformation drama. And, according to Gartner’s research, 54% of HR leaders indicate that poor technology and infrastructure is the main barrier to efficient work-from-home implementation.

The challenges of legacy technology in a work from home setting

The first challenge that users of legacy systems face is their inability to provide work-from-home capabilities at an enterprise-wide scale. Simon Migliano, Head of Research at Top10VPN, commented on enterprises’ abilities to provide VPN access to their entire workforce: “We know of at least one company whose VPN capacity is 8,000 users…Now, they have over five times as many employees trying to connect, with predictably frustrating results.”

According to an estimate by Rob Smith, another Gartner analyst, around one-third of enterprises lacked the proper equipment and knowledge to work from home. Another one-third had no plan at all and had never planned ahead to create any kind of telecommuting strategy.

Many of the enterprises that didn’t proactively update their systems and corporate culture for the growing work-from-home trend found themselves in a challenging situation. Their IT departments needed to quickly adapt existing, usually outdated, on-premise software solutions that were never built to provide telecommuting capabilities.

These on-premise enterprise content management solutions were built with the preconception that there will be people close by to monitor and manage infrastructure, processes, documents, etc. Such proprietary systems needed the physical presence of people, so they could work efficiently, reliably, and securely.

Security is another challenge for enterprises using outdated systems. This is especially the case in regulated industries where workers go through security routines to get to their desks. Now, with the work from home culture, all those physical security measures account for nothing as enterprises are forced to grant remote access to sensitive data and processes without a tried and tested security setup.

For all these problems, the “government, legal, insurance, banking and healthcare are all great examples,” says Sumir Karayi, CEO and founder of 1E. “Many companies and organizations in these industries are working on legacy systems and are using software that is not patched. Not only does this mean remote work is a security concern, but it makes working a negative, unproductive experience for the employee.”

That leads us to another challenge: workforce productivity. The loss of productivity in enterprises that were not prepared for remote work is due to a lack of modern technology and a lack of employee training. An outdated enterprise infrastructure coupled with an untrained workforce for a work from home setting means enterprises found themselves in unknown territory with no battle plan.

Employees in unprepared enterprises are now juggling work and life challenges without any prior training. We’ll be reviewing these workforce challenges in the next blog post. For the time being, we’ll keep our focus on the technology side of the problem.

Proprietary and on-premise solutions should be replaced with modern Software-as-a-Service (SaaS) solutions.

Why is SaaS the solution for enterprise management?


Secure data access from anywhere for the entire workforce is one of the key reasons why enterprises should consider cloud solutions. Picking the right SaaS solution, coupled with proper training, can lead to productivity growth and employee satisfaction. Consequently, this will result in improved customer satisfaction. Enterprises that have planned and prepared for a work-from-home culture can, in fact, see solid growth during this COVID-19 period.

In general, SaaS companies and providers are offering a variety of enterprise solutions, such as:

  • enterprise content management (ECM),
  • business process management (BPM),
  • customer relationship management (CRM),
  • document management,
  • case management,
  • payroll and billing processing,
  • human resource management.

Today, according to 451 Research, the cloud is the new mainstream, with approximately 90% of organizations surveyed using some type of cloud service. In 2019, around 60% of all workloads were running on some form of a hosted cloud service. This represents a huge rise from 45% just a year ago. Amazon Web Services (AWS) is the leading cloud vendor, with a 32% market share and 41% annual growth.

According to the IDG survey, 89% of companies use some kind of SaaS service. In another study, Cisco was cited as stating that by 2021, approximately 75% of all cloud workloads and compute instances will be SaaS. The leading reasons for SaaS adoption across enterprises are:

  • ability to work outside of the office (42%),
  • ease of disaster recovery (38%),
  • flexibility (37%),
  • offloading IT support (36%).


The rate of SaaS adoption by enterprises has only accelerated because of the pandemic. At what rate, we’re yet to see. We should see a faster adoption pace as more enterprises move off of proprietary solutions to SaaS solutions simply because the old systems don’t provide the flexibility required by the new normal enforced by the COVID-19 pandemic.

SaaS benefits beyond productivity

SaaS is quickly becoming a reliable choice for private enterprises and government organizations. Enterprise executives are considering SaaS during this pandemic more than ever as they seek to adjust their organizations to a work from home culture and ensure staff efficiency.

It’s not only efficiency, though. According to the Rackspace study of 1,300 surveyed companies, 88% of the enterprises using cloud services have experienced cost savings. Additionally, 56% reported an increase in profits.

Let’s look at a few other obvious benefits enterprises and agencies can gain from using SaaS solutions over the more traditional on-premise approach:

  • Reduction of the hardware cost

When using SaaS, there is no need to maintain existing server infrastructure on-premise. Cutting out the cost of hardware purchases and maintenance is especially important for fast-growing enterprises. New hardware can be bulky, expensive, and demanding of special treatment and HVAC improvements. SaaS-based cloud solutions overcome these issues. Furthermore, the cost of repairing and replacing hardware components is passed on to the vendors to worry about.

  • Reduction of electricity and real estate expenses

The reduction of on-premise hardware will directly and positively affect the organization’s electricity and real estate expenses. Enterprises that adopt a SaaS solution for their ECM needs will free up real estate in their buildings, cut electricity costs to power that IT equipment and reduce their HVAC bills as there would be no need to climatize large rooms full of heat-producing equipment.

  • IT Support staff savings

By offloading hardware concerns to the SaaS supplier, enterprises and agencies will no longer need a considerable workforce to maintain those systems. Routine maintenance, patching, and hardware and software upgrades are all part of the SaaS offer, and this can amount to considerable monthly savings.

  • Reduce the time and cost for implementation and training

The deployment of a new cloud-based solution is considerably faster than any conventional system implementation. While an on-premise solution would take months of work to set up the infrastructure, install the software, and conduct internal training, a SaaS solution can be set up in a few hours. Of course, there will be exceptions to this, since some enterprises may need customization of an existing SaaS solution. But all customization would be handled by a much larger IT team that can deploy a solution much faster. Most SaaS providers already have ready-made libraries of support materials, like manuals and how-to videos, which makes training a self-service task for employees.

  • Flexible cost of ownership


The typical SaaS pricing model is pay-as-you-go. Sometimes, there will be an initial setup fee if customizations are needed. After the setup, organizations will be looking at a relatively small fee each month, depending on the size of their workforce and features/resources they use. This pricing model is flexible, and it offers different ways for organizations to reduce overhead:

  1. First, the enterprise does not face a steep annual license fee. With SaaS, there is no such thing as a software license. You rent the service and pay a small fee per user, per month.
  2. Second, this pay-as-you-go software can be canceled at any time, for example, if a pandemic happens. Monthly fees can also be reduced accordingly if and when the enterprise decides to reduce the workforce. The flip side, of course, is also the ease of scaling up the workforce. The only expense here would be the few minutes needed to create a new user profile for the employee, and they’d be ready to start using the software and the vendor-produced training materials.
  3. Third, enterprise SaaS solutions usually come with usage-based fees. This means that enterprises won’t face fixed fees for data storage or computing power they rarely use. This helps enterprises avoid the fixed expense of additional hardware that they would only use in peak periods. Cloud infrastructure providers, like AWS, enable SaaS providers to get the maximum value from server farms, and this benefit is clearly transferable to the end user, which in this case is the enterprise or organization using a SaaS solution.
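To make the pay-as-you-go arithmetic concrete, here is a small sketch; every fee and quantity is invented for illustration, since actual SaaS pricing varies by vendor:

```python
def monthly_saas_cost(users, per_user_fee, storage_gb=0.0, per_gb_fee=0.0):
    """Pay-as-you-go: a per-user subscription plus usage-based storage fees."""
    return users * per_user_fee + storage_gb * per_gb_fee

# Scaling the workforce down scales the bill down with it -- no sunk license cost.
before = monthly_saas_cost(200, 15.0, storage_gb=500, per_gb_fee=0.10)
after = monthly_saas_cost(150, 15.0, storage_gb=500, per_gb_fee=0.10)
```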

This is clearly a very short list of extra benefits besides employee productivity and employee satisfaction improvements. As stated in the statistics above, enterprises and organizations are aware of the benefits of cloud-based solutions. This is why more and more of them are replacing proprietary and on-premise solutions with SaaS solutions. The pandemic only expedited this migration to the cloud as the work from home mode is now the only viable way for some organizations to continue operations.

The final step: SaaS-based Technology implementation

It used to be the case that the most prepared organizations in terms of software and hardware were the most resistant to cloud-based solutions. Their upfront investment made the cloud idea a bit redundant. But, times have changed. As Rick Holland, CISO and vice president of strategy at Digital Shadows, stated for Threatpost: “One of the unintended consequences of COVID-19 will likely be increased zero trust adoption that further embraces cloud services, eliminates VPNs, and enables employees to work from anywhere.”

Continuity of operations must be ensured. Enterprises will offload everything to the cloud: software, data storage, operations, processing power, user management, etc. Before the pandemic, Gartner estimated that 50% of government organizations across the US were using cloud solutions. This number will rise even faster during a pandemic, as SaaS solutions are already built with remote work in mind.

The crucial step in implementing SaaS-based solutions is finding a reliable technology partner. This partner would ideally be a company that already has the know-how and routines in place to perform critical data migration and workflow creation using reliable technology; a company recognized for its expertise, reliability, and ability to work under pressure.

Armedia LLC is a CMMI Level 3 company that provides a niche focus in Enterprise Content Management (ECM) technical and advisory services. We are a proven provider in delivering modern, flexible, robust, and scalable solutions to federal/state/local government, as well as commercial enterprises. With 18 years of deep ECM experience, our skilled team with industry certifications has helped deploy hundreds of ECM solutions. In just the past year, Armedia has actively supported over 20 initiatives in the government and commercial market.

For more info, don’t hesitate to contact us. We’d love to hear your thoughts on the topic, and we’d appreciate you sharing this blog post on your social media.