{"id":17,"date":"2025-06-20T05:20:19","date_gmt":"2025-06-20T05:20:19","guid":{"rendered":"https:\/\/dataopsschool.com\/blog\/?p=17"},"modified":"2025-08-06T10:30:12","modified_gmt":"2025-08-06T10:30:12","slug":"dataops-in-the-context-of-devsecops","status":"publish","type":"post","link":"https:\/\/dataopsschool.com\/blog\/dataops-in-the-context-of-devsecops\/","title":{"rendered":"DataOps in the Context of DevSecOps"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\"><strong>1. Introduction &amp; Overview<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">What is DataOps?<\/h3>\n\n\n\n<p><strong>DataOps<\/strong> is a collaborative data management practice that applies Agile, DevOps, and lean manufacturing principles to the end-to-end data lifecycle. Its goal is to improve the speed, quality, and security of data analytics by fostering better communication, automation, and governance between data engineers, scientists, analysts, and operations teams.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.dataops.live\/hs-fs\/hubfs\/Infinity%20diagram.png?width=3322&amp;height=1020&amp;name=Infinity%20diagram.png\" alt=\"\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">History or Background<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>2014<\/strong>: Term &#8220;DataOps&#8221; introduced by Lenny Liebmann at IBM Big Data Hub.<\/li>\n\n\n\n<li><strong>2017<\/strong>: Andy Palmer (Tamr) helped popularize it further.<\/li>\n\n\n\n<li><strong>2020+<\/strong>: Tools like <strong>Apache NiFi<\/strong>, <strong>Airflow<\/strong>, <strong>Dagster<\/strong>, and <strong>Kubeflow<\/strong> started integrating DataOps concepts.<\/li>\n\n\n\n<li><strong>2023\u20132025<\/strong>: Widespread enterprise adoption across <strong>Finance, Healthcare, Retail<\/strong>, and <strong>Security<\/strong>.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Why is it Relevant in DevSecOps?<\/h3>\n\n\n\n<figure 
class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Relevance in DevSecOps<\/th><th>Description<\/th><\/tr><\/thead><tbody><tr><td>\ud83d\udd04 Continuous Data Integration<\/td><td>Syncs secure data with CI\/CD pipelines and analytics workflows<\/td><\/tr><tr><td>\ud83d\udd0d Real-Time Security Analysis<\/td><td>Feeds logs, events, and telemetry data to security analytics systems<\/td><\/tr><tr><td>\u2705 Compliance &amp; Auditing<\/td><td>Ensures PII\/GDPR\/HIPAA compliance in pipelines using policy-as-code<\/td><\/tr><tr><td>\u2699\ufe0f Automation of Data Checks<\/td><td>Integrates automated testing for data quality, schema drift, and lineage<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>2. Core Concepts &amp; Terminology<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Key Terms and Definitions<\/h3>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Term<\/th><th>Definition<\/th><\/tr><\/thead><tbody><tr><td><strong>Data Pipeline<\/strong><\/td><td>An automated sequence of steps to move, clean, and transform data.<\/td><\/tr><tr><td><strong>Orchestration<\/strong><\/td><td>Coordination of tasks (e.g., Apache Airflow for DAG-based orchestration).<\/td><\/tr><tr><td><strong>Data Observability<\/strong><\/td><td>Monitoring data for quality, lineage, freshness, and anomalies.<\/td><\/tr><tr><td><strong>Data Lineage<\/strong><\/td><td>Track how data moves and transforms across systems.<\/td><\/tr><tr><td><strong>DataOps Toolchain<\/strong><\/td><td>Tools used for ingestion, transformation, observability, versioning, etc.<\/td><\/tr><tr><td><strong>Policy-as-Code<\/strong><\/td><td>Security\/compliance rules embedded in the pipeline via code.<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">How It Fits into the DevSecOps Lifecycle<\/h3>\n\n\n\n<figure 
class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>DevSecOps Stage<\/th><th>DataOps Integration<\/th><\/tr><\/thead><tbody><tr><td><strong>Plan<\/strong><\/td><td>Define data models, privacy policies, and risk assessments.<\/td><\/tr><tr><td><strong>Develop<\/strong><\/td><td>Use version control for data pipelines and transformations.<\/td><\/tr><tr><td><strong>Build<\/strong><\/td><td>Integrate tests for data quality and schema validation.<\/td><\/tr><tr><td><strong>Test<\/strong><\/td><td>Automate security, compliance, and unit testing of data flows.<\/td><\/tr><tr><td><strong>Release<\/strong><\/td><td>Use CI\/CD to deploy pipelines with audit trails.<\/td><\/tr><tr><td><strong>Operate<\/strong><\/td><td>Monitor data SLAs, errors, and lineage.<\/td><\/tr><tr><td><strong>Monitor<\/strong><\/td><td>Trigger alerts on anomalies, unauthorized access, or breaches.<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<pre class=\"wp-block-code\"><code>DevSecOps Pipeline:\n\n&#091;Code Commit] --&gt; &#091;CI\/CD] --&gt; &#091;Test + Scan] --&gt; &#091;Deploy] --&gt; &#091;Monitor] --&gt; &#091;Audit]\n\n              \\----&gt; &#091;DataOps: Real-time data, logs, metrics feed into Security &amp; Monitoring]\n<\/code><\/pre>\n\n\n\n<p>DataOps complements DevSecOps by continuously managing <strong>secure data flows<\/strong> and analytics pipelines through automation, security checks, and observability.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>3. 
Architecture &amp; How It Works<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Components of a DataOps Architecture<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Data Sources<\/strong>: Databases, APIs, IoT, logs, etc.<\/li>\n\n\n\n<li><strong>Ingestion Layer<\/strong>: Tools like Apache NiFi, Kafka, or Fivetran.<\/li>\n\n\n\n<li><strong>Storage &amp; Lakehouse<\/strong>: AWS S3, Google BigQuery, Snowflake, Delta Lake.<\/li>\n\n\n\n<li><strong>Transformation Layer<\/strong>: dbt, Apache Spark.<\/li>\n\n\n\n<li><strong>Testing &amp; Validation<\/strong>: Great Expectations, Soda Core.<\/li>\n\n\n\n<li><strong>Orchestration<\/strong>: Apache Airflow, Prefect, Dagster.<\/li>\n\n\n\n<li><strong>CI\/CD Integration<\/strong>: GitHub Actions, GitLab CI, Jenkins.<\/li>\n\n\n\n<li><strong>Monitoring &amp; Observability<\/strong>: Monte Carlo, Databand, Prometheus.<\/li>\n\n\n\n<li><strong>Security &amp; Compliance<\/strong>: Vault, Ranger, IAM policies, encryption.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Layer<\/th><th>Tools \/ Tech Examples<\/th><\/tr><\/thead><tbody><tr><td><strong>Ingestion<\/strong><\/td><td>Apache Kafka, Logstash, NiFi<\/td><\/tr><tr><td><strong>Transformation<\/strong><\/td><td>dbt, Apache Beam, Spark, Python scripts<\/td><\/tr><tr><td><strong>Storage<\/strong><\/td><td>AWS S3, HDFS, Snowflake, Data Lakes<\/td><\/tr><tr><td><strong>Orchestration<\/strong><\/td><td>Apache Airflow, Dagster, Prefect<\/td><\/tr><tr><td><strong>Monitoring<\/strong><\/td><td>Monte Carlo, Databand, Prometheus + Grafana<\/td><\/tr><tr><td><strong>Governance<\/strong><\/td><td>Apache Atlas, Collibra, Amundsen<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/assets.qlik.com\/image\/upload\/w_1800\/q_auto\/qlik\/glossary\/dataops\/seo-dataops-how-it-works_babonv.png\" alt=\"\" \/><\/figure>\n\n\n\n<h3 
class=\"wp-block-heading\">Internal Workflow<\/h3>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Code\/Data commit triggers pipeline.<\/li>\n\n\n\n<li>CI\/CD tools test and validate transformations.<\/li>\n\n\n\n<li>Pipelines deploy to staging \u2192 production.<\/li>\n\n\n\n<li>Monitoring agents track data quality and performance.<\/li>\n\n\n\n<li>Alerts\/logs integrated into SIEM or DevSecOps dashboards.<\/li>\n<\/ol>\n\n\n\n<pre class=\"wp-block-code\"><code>1. Ingest raw data \u279c 2. Clean &amp; validate \u279c 3. Transform &amp; enrich \u279c\n4. Load into secure storage \u279c 5. Monitor metrics &amp; anomalies \u279c\n6. Audit logs + Notify via CI\/CD\/Slack\/Jira<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Architecture Diagram (Description)<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code>&#091;Source Systems] \n     \u2193\n&#091;Ingestion (Kafka\/NiFi)] \n     \u2193\n&#091;Storage (S3\/Snowflake)] \u2190\u2192 &#091;Security (IAM\/Vault)]\n     \u2193\n&#091;Transformation (dbt\/Spark)] \u2190\u2192 &#091;Testing (Great Expectations)]\n     \u2193\n&#091;Orchestration (Airflow)] \n     \u2193\n&#091;Monitoring (Prometheus, Monte Carlo)] \n     \u2193\n&#091;Dashboards + Alerts \u2192 SIEM tools \/ DevSecOps Observability]\n<\/code><\/pre>\n\n\n\n<pre class=\"wp-block-code\"><code>&#091;Sources] --&gt; &#091;Ingest Layer: Kafka\/NiFi] --&gt; &#091;Processing: dbt\/Spark] --&gt; \n&#091;Orchestrator: Airflow] --&gt; &#091;Data Lake or DW] --&gt; &#091;Monitoring + Alerts] \n      |\n    &#091;Security &amp; Compliance: Policy-as-Code, Logging, Access Control]<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Integration Points with CI\/CD and Cloud Tools<\/h3>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Integration<\/th><th>Tool<\/th><th>Purpose<\/th><\/tr><\/thead><tbody><tr><td>GitOps<\/td><td>GitHub Actions, GitLab CI<\/td><td>Versioned, auditable data workflows<\/td><\/tr><tr><td>Secrets 
Mgmt<\/td><td>HashiCorp Vault<\/td><td>Secure API keys and credentials<\/td><\/tr><tr><td>Cloud<\/td><td>AWS\/GCP\/Azure<\/td><td>Scalable, serverless data ops<\/td><\/tr><tr><td>Containerization<\/td><td>Docker, Kubernetes<\/td><td>Deploy pipelines as microservices<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Integration Point<\/th><th>Examples<\/th><\/tr><\/thead><tbody><tr><td>CI\/CD Trigger<\/td><td>Jenkins\/GitHub Actions triggers the data pipeline<\/td><\/tr><tr><td>Containerization<\/td><td>Dockerized Spark\/Airflow on Kubernetes<\/td><\/tr><tr><td>Cloud Services<\/td><td>AWS Glue, Azure Data Factory, GCP Dataflow<\/td><\/tr><tr><td>Secrets Management<\/td><td>HashiCorp Vault, AWS Secrets Manager<\/td><\/tr><tr><td>Data Quality Scanning<\/td><td>Great Expectations, Datafold, Soda<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>4. 
Installation &amp; Getting Started<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Basic Setup &amp; Prerequisites<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Git, Python 3.x, Docker<\/li>\n\n\n\n<li>Cloud access (AWS\/GCP preferred)<\/li>\n\n\n\n<li>DataOps stack tools (e.g., dbt, Airflow, Great Expectations)<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Step-by-Step: DataOps with Airflow + dbt + Great Expectations<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code># Step 1: Clone the repo\ngit clone https:\/\/github.com\/example\/dataops-demo.git\ncd dataops-demo\n\n# Step 2: Start Airflow with Docker\ndocker-compose up -d\n\n# Step 3: Initialize the Airflow database\ndocker-compose exec airflow-webserver airflow db init\n\n# Step 4: Access the UI at http:\/\/localhost:8080 (login: admin \/ admin)\n\n# Step 5: Set up your dbt project\npip install dbt-core\ndbt init my_project\n<\/code><\/pre>\n\n\n\n<p>Now you have a functional pipeline: Airflow orchestrates your dbt models!<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>5. 
Real-World Use Cases<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">\u2705 Use Case 1: Continuous Security Data Ingestion<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Ingest threat logs from multiple tools (e.g., Falco, CrowdStrike)<\/li>\n\n\n\n<li>Transform &amp; analyze with Spark<\/li>\n\n\n\n<li>Alert via Airflow DAG on anomaly detection<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">\u2705 Use Case 2: GDPR Compliance Pipeline<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Scan data using Great Expectations for PII<\/li>\n\n\n\n<li>Route violations to Splunk or Jira for compliance officers<\/li>\n\n\n\n<li>Record lineage using Apache Atlas<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">\u2705 Use Case 3: Automated Model Monitoring in FinTech<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Data flows from real-time trading system<\/li>\n\n\n\n<li>Validated daily by Monte Carlo<\/li>\n\n\n\n<li>Alerts if model drift or schema changes are detected<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">\u2705 Use Case 4: Retail Inventory Forecasting<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Data from 50 stores ingested nightly<\/li>\n\n\n\n<li>dbt transforms it into sales + inventory dashboards<\/li>\n\n\n\n<li>Slack alerts sent for threshold breaches<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>6. 
Benefits &amp; Limitations<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Key Advantages<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>\u23f1\ufe0f Faster delivery of data products<\/li>\n\n\n\n<li>\ud83d\udd10 Embedded security &amp; compliance<\/li>\n\n\n\n<li>\ud83d\udd0d Observability and quality checks<\/li>\n\n\n\n<li>\ud83d\udd04 Integration with DevOps toolchains<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Common Challenges<\/h3>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Challenge<\/th><th>Notes<\/th><\/tr><\/thead><tbody><tr><td>\ud83d\udd0d Tool Sprawl<\/td><td>Too many tools can complicate management<\/td><\/tr><tr><td>\ud83e\udde0 Skill Gap<\/td><td>Requires knowledge in both DevOps and Data Engineering<\/td><\/tr><tr><td>\ud83d\udd12 Data Security Complexity<\/td><td>Securing pipelines across cloud platforms can be difficult<\/td><\/tr><tr><td>\ud83d\udd04 Testing Complexity<\/td><td>Difficult to version\/test data transformations like software<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>7. 
Best Practices &amp; Recommendations<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">\ud83d\udd10 Security, Maintenance, and Compliance<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use <strong>encryption in transit and at rest<\/strong><\/li>\n\n\n\n<li>Integrate with <strong>policy-as-code frameworks<\/strong> (e.g., OPA)<\/li>\n\n\n\n<li>Automate <strong>data quality checks<\/strong> via Great Expectations<\/li>\n\n\n\n<li>Rotate secrets using <strong>Vault or cloud-native managers<\/strong><\/li>\n\n\n\n<li>Store lineage in Apache Atlas or Marquez<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">\u2699\ufe0f Performance &amp; Automation Tips<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Run batch jobs on <strong>auto-scaling clusters<\/strong><\/li>\n\n\n\n<li>Use <strong>GitOps<\/strong> to version-control pipeline configs<\/li>\n\n\n\n<li>Monitor with <strong>Grafana dashboards<\/strong><\/li>\n\n\n\n<li>Use <strong>CI\/CD<\/strong> to auto-deploy dbt or Airflow DAG changes<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>8. 
Comparison with Alternatives<\/strong><\/h2>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Feature<\/th><th>DataOps (Airflow + dbt)<\/th><th>Traditional ETL Tools<\/th><th>MLOps<\/th><\/tr><\/thead><tbody><tr><td>Automation<\/td><td>\u2705 High<\/td><td>\u274c Low<\/td><td>\u2705 Medium<\/td><\/tr><tr><td>Version Control<\/td><td>\u2705 Git-Based<\/td><td>\u274c Manual<\/td><td>\u2705 Git<\/td><\/tr><tr><td>Security &amp; Compliance<\/td><td>\u2705 Integrated<\/td><td>\u274c Minimal<\/td><td>\u2705 Integrated<\/td><\/tr><tr><td>CI\/CD Integration<\/td><td>\u2705 Strong<\/td><td>\u274c Weak<\/td><td>\u2705 Medium<\/td><\/tr><tr><td>Data Lineage<\/td><td>\u2705 Native Support<\/td><td>\u274c Rare<\/td><td>\u2705 Medium<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">When to Choose DataOps<\/h3>\n\n\n\n<p>Choose DataOps if:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>You need <strong>real-time secure data flows<\/strong><\/li>\n\n\n\n<li>You&#8217;re working in a <strong>DevSecOps or regulated<\/strong> environment<\/li>\n\n\n\n<li>You want <strong>CI\/CD-style delivery for data pipelines<\/strong><\/li>\n\n\n\n<li>Your teams include <strong>DevOps + Data + Security engineers<\/strong><\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>9. Conclusion<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Final Thoughts<\/h3>\n\n\n\n<p>DataOps is no longer optional \u2014 it\u2019s <strong>foundational<\/strong> in DevSecOps pipelines where secure, fast, and auditable data handling is critical. 
It merges automation, observability, and compliance with modern data engineering.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>The future of DevSecOps is <strong>data-aware<\/strong> and <strong>AI-augmented<\/strong>, and DataOps is the enabler.<\/p>\n<\/blockquote>\n\n\n\n<h3 class=\"wp-block-heading\">Future Trends<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Rise of <strong>Data Contracts<\/strong> for API-level data governance<\/li>\n\n\n\n<li>Integration with <strong>AI Observability<\/strong> tools<\/li>\n\n\n\n<li>Fully <strong>serverless DataOps<\/strong> platforms<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n","protected":false},"excerpt":{"rendered":"<p>1. Introduction &amp; Overview What is DataOps? DataOps is a collaborative data management practice that applies Agile, DevOps, and lean manufacturing principles to the end-to-end data lifecycle&#8230;. 
<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-17","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/posts\/17","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/comments?post=17"}],"version-history":[{"count":3,"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/posts\/17\/revisions"}],"predecessor-version":[{"id":363,"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/posts\/17\/revisions\/363"}],"wp:attachment":[{"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/media?parent=17"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/categories?post=17"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dataopsschool.com\/blog\/wp-json\/wp\/v2\/tags?post=17"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}