<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="http://www.stefanprobst.dev/feed.xml" rel="self" type="application/atom+xml" /><link href="http://www.stefanprobst.dev/" rel="alternate" type="text/html" /><updated>2025-04-16T11:33:52+00:00</updated><id>http://www.stefanprobst.dev/feed.xml</id><title type="html">Stefan Probst</title><subtitle>Senior Software Engineer</subtitle><author><name>Stefan Probst</name></author><entry><title type="html">Disabling WSDL Validation in Quarkus CXF</title><link href="http://www.stefanprobst.dev/quarkus-cxf-validation/" rel="alternate" type="text/html" title="Disabling WSDL Validation in Quarkus CXF" /><published>2025-04-16T00:00:00+00:00</published><updated>2025-04-16T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/quarkus-cxf-validation</id><content type="html" xml:base="http://www.stefanprobst.dev/quarkus-cxf-validation/"><![CDATA[<p>When working with SOAP services using Quarkus CXF, you might encounter situations where the incoming messages don’t strictly adhere to the WSDL definition. This can happen when the API evolves and new fields are added without prior notification, leading to validation errors and preventing your application from processing these messages.</p>

<p>By default, CXF performs validation against the WSDL schema. While this is generally a good practice, it can become problematic in dynamic environments where the API contract might not always be perfectly aligned with the deployed WSDL.</p>

<p>This blog post demonstrates how to disable incoming message validation in your Quarkus CXF client to handle such scenarios gracefully.</p>

<h2 id="the-problem-strict-wsdl-validation">The Problem: Strict WSDL Validation</h2>

<p>Imagine your Quarkus application consumes a SOAP service defined by a WSDL. If the service provider adds a new, optional field to their response that isn’t described in the current WSDL your client has, CXF’s default validation will likely throw an exception, preventing you from processing the response.</p>

<p>Ideally, you would update your client’s WSDL and generated code whenever the service changes, but this might not always be feasible or immediately possible. In such cases, temporarily disabling validation can provide a workaround.</p>

<h2 id="the-solution-a-custom-interceptor">The Solution: A Custom Interceptor</h2>

<p>We can create a custom CXF interceptor that disables the JAXB validation event handler, which is responsible for enforcing schema compliance. Here’s the Java code for the interceptor:</p>

<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">import</span> <span class="nn">org.apache.cxf.message.Message</span><span class="o">;</span>
<span class="kn">import</span> <span class="nn">org.apache.cxf.phase.AbstractPhaseInterceptor</span><span class="o">;</span>
<span class="kn">import</span> <span class="nn">org.apache.cxf.phase.Phase</span><span class="o">;</span>
<span class="kn">import</span> <span class="nn">org.apache.cxf.interceptor.Fault</span><span class="o">;</span>

<span class="kd">public</span> <span class="kd">class</span> <span class="nc">DisableMessageValidationInterceptor</span> <span class="kd">extends</span> <span class="nc">AbstractPhaseInterceptor</span><span class="o">&lt;</span><span class="nc">Message</span><span class="o">&gt;</span>
<span class="o">{</span>
    <span class="kd">public</span> <span class="nf">DisableMessageValidationInterceptor</span><span class="o">()</span>
    <span class="o">{</span>
        <span class="kd">super</span><span class="o">(</span><span class="nc">Phase</span><span class="o">.</span><span class="na">PRE_PROTOCOL</span><span class="o">);</span>
    <span class="o">}</span>

    <span class="kd">public</span> <span class="kt">void</span> <span class="nf">handleMessage</span><span class="o">(</span><span class="nc">Message</span> <span class="n">message</span><span class="o">)</span> <span class="kd">throws</span> <span class="nc">Fault</span>
    <span class="o">{</span>
        <span class="n">message</span><span class="o">.</span><span class="na">put</span><span class="o">(</span><span class="s">"set-jaxb-validation-event-handler"</span><span class="o">,</span> <span class="s">"false"</span><span class="o">);</span>
    <span class="o">}</span>
<span class="o">}</span>
</code></pre></div></div>

<h2 id="explanation">Explanation</h2>

<ul>
  <li>We create a class <code class="language-plaintext highlighter-rouge">DisableMessageValidationInterceptor</code> that extends <code class="language-plaintext highlighter-rouge">AbstractPhaseInterceptor&lt;Message&gt;</code>.</li>
  <li>The constructor calls the superclass constructor with <code class="language-plaintext highlighter-rouge">Phase.PRE_PROTOCOL</code>. This ensures that our interceptor runs early in the interceptor chain, before the protocol-specific processing.</li>
  <li>The <code class="language-plaintext highlighter-rouge">handleMessage</code> method is where the core logic resides.</li>
  <li>We use <code class="language-plaintext highlighter-rouge">message.put("set-jaxb-validation-event-handler", "false")</code> to instruct CXF to disable the JAXB validation event handler for the current message. This effectively tells the JAXB unmarshaller to ignore validation errors.</li>
</ul>

<h2 id="integrating-the-interceptor-in-quarkus">Integrating the Interceptor in Quarkus</h2>

<p>To use this interceptor with your Quarkus CXF client, you need to register it in your <code class="language-plaintext highlighter-rouge">application.properties</code> file. Assuming your CXF client is named “my-soap-client”, you would add the following line:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>quarkus.cxf.client."my-soap-client".in-interceptors=com.example.DisableMessageValidationInterceptor
</code></pre></div></div>

<h2 id="conclusion">Conclusion</h2>

<p>By implementing and registering this custom interceptor, you can effectively disable incoming message validation for your Quarkus CXF client. This can be a useful strategy when dealing with SOAP APIs that might evolve without strict adherence to the published WSDL.</p>

<p>However, it’s crucial to understand the implications of disabling validation:</p>

<ul>
  <li>You lose the guarantee that incoming messages strictly conform to the WSDL.</li>
  <li>Your application might need to be more resilient and handle unexpected data or missing fields.</li>
  <li>Disabling validation should be considered a temporary workaround. Ideally, you should strive to keep your client’s WSDL and generated code up to date with the service definition.</li>
</ul>
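<p>With validation disabled, fields that the relaxed unmarshaller cannot map are typically left unset rather than triggering a fault, so consumer code should tolerate null values. Here is a minimal sketch of such defensive handling (the <code class="language-plaintext highlighter-rouge">OrderResponse</code> type and its field are hypothetical stand-ins for your WSDL-generated classes, not part of the CXF API):</p>

```java
// Sketch only: "OrderResponse" stands in for a class generated from your WSDL;
// the field name is an illustrative assumption.
class OrderResponse {
    private String status; // may be null when the provider omits or renames the field

    public String getStatus() { return status; }
    public void setStatus(String status) { this.status = status; }
}

class OrderStatusClient {
    // Returns a safe default instead of failing on fields the relaxed
    // unmarshaller left unset.
    static String extractStatus(OrderResponse response) {
        if (response == null || response.getStatus() == null) {
            return "UNKNOWN";
        }
        return response.getStatus();
    }
}
```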

<p>Use this technique judiciously and ensure your application is prepared to handle potentially non-compliant messages.</p>]]></content><author><name>Stefan Probst</name></author><summary type="html"><![CDATA[When working with SOAP services using Quarkus CXF, you might encounter situations where the incoming messages don’t strictly adhere to the WSDL definition. This can happen when the API evolves and new fields are added without prior notification, leading to validation errors and preventing your application from processing these messages.]]></summary></entry><entry><title type="html">Terraform: `count` vs `for_each` - Why Named Indices Matter</title><link href="http://www.stefanprobst.dev/terraform-for-each-vs-count/" rel="alternate" type="text/html" title="Terraform: `count` vs `for_each` - Why Named Indices Matter" /><published>2024-10-01T00:00:00+00:00</published><updated>2024-10-01T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/terraform-for-each-vs-count</id><content type="html" xml:base="http://www.stefanprobst.dev/terraform-for-each-vs-count/"><![CDATA[<p>When creating multiple instances of the same resource in Terraform, you have two primary meta-arguments at your disposal: count and for_each. While both serve the purpose of creating multiple resources, they differ in their approach and have distinct implications for resource management.</p>]]></content><author><name>Stefan Probst</name></author><summary type="html"><![CDATA[When creating multiple instances of the same resource in Terraform, you have two primary meta-arguments at your disposal: count and for_each. 
While both serve the purpose of creating multiple resources, they differ in their approach and have distinct implications for resource management.]]></summary></entry><entry><title type="html">Terraform and Azure Pipelines - Handling Complex Variables</title><link href="http://www.stefanprobst.dev/complex-terraform-azure-pipeline-parameters/" rel="alternate" type="text/html" title="Terraform and Azure Pipelines - Handling Complex Variables" /><published>2024-09-27T00:00:00+00:00</published><updated>2024-09-27T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/complex-terraform-azure-pipeline-parameters</id><content type="html" xml:base="http://www.stefanprobst.dev/complex-terraform-azure-pipeline-parameters/"><![CDATA[<p>In a recent project, I created a monitoring action group using Terraform and aimed to configure and execute it within an Azure pipeline. The defined variable is a list of objects with the keys <code class="language-plaintext highlighter-rouge">name</code> and <code class="language-plaintext highlighter-rouge">email_address</code>. Terraform expects this variable as json input and this is where the difficulty began.</p>

<div class="language-terraform highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">variable</span> <span class="s2">"email_addresses"</span> <span class="p">{</span>
  <span class="nx">type</span> <span class="p">=</span> <span class="nx">list</span><span class="p">(</span><span class="nx">object</span><span class="p">({</span>
    <span class="nx">name</span>          <span class="p">=</span> <span class="nx">string</span>
    <span class="nx">email_address</span> <span class="p">=</span> <span class="nx">string</span>
  <span class="p">}))</span>
  <span class="nx">default</span> <span class="p">=</span> <span class="p">[]</span>
<span class="p">}</span>

<span class="k">resource</span> <span class="s2">"azurerm_monitor_action_group"</span> <span class="s2">"monitoring_action_group"</span> <span class="p">{</span>
  <span class="nx">name</span>                <span class="p">=</span> <span class="s2">"Application Monitoring"</span>
  <span class="nx">resource_group_name</span> <span class="p">=</span> <span class="nx">azurerm_resource_group</span><span class="p">.</span><span class="nx">resource_group</span><span class="p">.</span><span class="nx">name</span>
  <span class="nx">short_name</span>          <span class="p">=</span> <span class="s2">"EmailAlert"</span>

  <span class="nx">dynamic</span> <span class="s2">"email_receiver"</span> <span class="p">{</span>
    <span class="nx">for_each</span> <span class="p">=</span> <span class="kd">var</span><span class="p">.</span><span class="nx">email_addresses</span>
    <span class="nx">content</span> <span class="p">{</span>
      <span class="nx">name</span>          <span class="p">=</span> <span class="nx">email_receiver</span><span class="p">.</span><span class="nx">value</span><span class="p">[</span><span class="s2">"name"</span><span class="p">]</span>
      <span class="nx">email_address</span> <span class="p">=</span> <span class="nx">email_receiver</span><span class="p">.</span><span class="nx">value</span><span class="p">[</span><span class="s2">"email_address"</span><span class="p">]</span>
    <span class="p">}</span>
  <span class="p">}</span>
<span class="p">}</span>
</code></pre></div></div>

<p>Azure Pipelines allows parameters of type <code class="language-plaintext highlighter-rouge">object</code> and therefore supports complex types. To turn such a parameter into a usable JSON data structure, I use an additional script step that converts the parameter to JSON and writes it to a JSON file. The generated JSON file with the list of email receivers can then be passed as a command option to the TerraformTaskV4@4 task.</p>

<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">parameters</span><span class="pi">:</span>
  <span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">alertEmails</span>
    <span class="na">type</span><span class="pi">:</span> <span class="s">object</span>
    <span class="na">default</span><span class="pi">:</span> <span class="pi">[]</span>

<span class="na">steps</span><span class="pi">:</span>
  <span class="pi">-</span> <span class="na">script</span><span class="pi">:</span> <span class="pi">|</span>
      <span class="s">echo "{\"email_addresses\": $ALERT_EMAILS_JSON}" &gt; $(terraform_working_dir)/terraform.tfvars.json</span>
    <span class="na">env</span><span class="pi">:</span>
      <span class="na">ALERT_EMAILS_JSON</span><span class="pi">:</span> <span class="s">${{ convertToJson(parameters.alertEmails) }}</span>
    <span class="na">displayName</span><span class="pi">:</span> <span class="s">Generate terraform.tfvars.json from parameters</span>
  <span class="s">...</span>
  <span class="pi">-</span> <span class="na">task</span><span class="pi">:</span> <span class="s">TerraformTaskV4@4</span>
    <span class="na">name</span><span class="pi">:</span> <span class="s">terraformPlan</span>
    <span class="na">displayName</span><span class="pi">:</span> <span class="s">Create Terraform Plan</span>
    <span class="na">inputs</span><span class="pi">:</span>
      <span class="na">provider</span><span class="pi">:</span> <span class="s">azurerm</span>
      <span class="na">command</span><span class="pi">:</span> <span class="s">plan</span>
      <span class="na">commandOptions</span><span class="pi">:</span> <span class="pi">&gt;-</span>
        <span class="s">-var environment=dev</span>
        <span class="s">-var-file="terraform.tfvars.json"</span>
      <span class="na">workingDirectory</span><span class="pi">:</span> <span class="s">$(terraform_working_dir)</span>
</code></pre></div></div>

<p>With this workaround, we can easily define a list of email receivers in our pipeline script.</p>

<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code>      <span class="pi">-</span> <span class="na">template</span><span class="pi">:</span> <span class="s">terraform.yml</span>
        <span class="na">parameters</span><span class="pi">:</span>
          <span class="na">alertEmails</span><span class="pi">:</span>
            <span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s2">"</span><span class="s">Foo</span><span class="nv"> </span><span class="s">Bar"</span>
              <span class="na">email_address</span><span class="pi">:</span> <span class="s2">"</span><span class="s">foobar@development.de"</span>
            <span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s2">"</span><span class="s">John</span><span class="nv"> </span><span class="s">Doe"</span>
              <span class="na">email_address</span><span class="pi">:</span> <span class="s2">"</span><span class="s">mail@john.com"</span>
</code></pre></div></div>]]></content><author><name>Stefan Probst</name></author><summary type="html"><![CDATA[In a recent project, I created a monitoring action group using Terraform and aimed to configure and execute it within an Azure pipeline. The defined variable is a list of objects with the keys name and email_address. Terraform expects this variable as json input and this is where the difficulty began.]]></summary></entry><entry><title type="html">Set up Azure Event Hub with log compaction in terraform</title><link href="http://www.stefanprobst.dev/azure-event-hub-log-compaction-terraform/" rel="alternate" type="text/html" title="Set up Azure Event Hub with log compaction in terraform" /><published>2024-04-11T00:00:00+00:00</published><updated>2024-04-11T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/azure-event-hub-log-compaction-terraform</id><content type="html" xml:base="http://www.stefanprobst.dev/azure-event-hub-log-compaction-terraform/"><![CDATA[<p>Enabling log compaction via the Terraform resource <a href="https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/eventhub">azurerm_eventhub</a> is currently not feasible. Additionally, a <a href="https://github.com/hashicorp/terraform-provider-azurerm/issues/25563">bug</a> has been identified within azurerm_eventhub_authorization_rule. Attempting to create an authorization rule encounters an issue when log compaction is enabled.</p>

<p>That’s why I transitioned to <a href="https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/overview">Azure ARM templates</a> for deploying an Event Hub with both log compaction and authorization rules.</p>

<div class="language-tf highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">resource</span> <span class="s2">"azurerm_resource_group"</span> <span class="s2">"resource_group"</span> <span class="p">{</span>
  <span class="nx">name</span>     <span class="p">=</span> <span class="kd">var</span><span class="p">.</span><span class="nx">application_name</span>
  <span class="nx">location</span> <span class="p">=</span> <span class="kd">var</span><span class="p">.</span><span class="nx">location</span>
<span class="p">}</span>

<span class="k">resource</span> <span class="s2">"azurerm_resource_group_template_deployment"</span> <span class="s2">"eventhub"</span> <span class="p">{</span>
  <span class="nx">name</span>                <span class="p">=</span> <span class="s2">"eventhub-template"</span>
  <span class="nx">resource_group_name</span> <span class="p">=</span> <span class="nx">azurerm_resource_group</span><span class="p">.</span><span class="nx">resource_group</span><span class="p">.</span><span class="nx">name</span>
  <span class="nx">deployment_mode</span>     <span class="p">=</span> <span class="s2">"Incremental"</span>
  <span class="nx">template_content</span>    <span class="p">=</span> <span class="o">&lt;&lt;</span><span class="no">TEMPLATE</span><span class="sh">
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.EventHub/namespaces",
      "apiVersion": "2023-01-01-preview",
      "name": "${var.application_name}",
      "location": "${azurerm_resource_group.resource_group.location}",
      "sku": {
          "name": "Premium",
          "tier": "Premium"
      },
      "properties": {
        "zoneRedundant": true
      },
      "resources": [
        {
          "apiVersion": "2023-01-01-preview",
          "name": "${var.eventhub_name}",
          "type": "eventhubs",
          "dependsOn": [
              "Microsoft.EventHub/namespaces/${var.application_name}"
          ],
          "properties": {
            "partitionCount": "1",
            "retentionDescription": {
              "cleanupPolicy": "Compact",
              "tombstoneRetentionTimeInHours": 96
            }
          },
          "resources": [
            {
              "type": "authorizationRules",
              "apiVersion": "2023-01-01-preview",
              "name": "default",
              "dependsOn": [
                "[resourceId('Microsoft.EventHub/namespaces/eventhubs/', '${var.application_name}', '${var.eventhub_name}')]"
              ],
              "properties": {
                "rights": ["Send", "Listen"]
              }
            }
          ]
        }
      ]
    }
  ],
  "outputs": {
    "RootManageSharedAccessKeyConnectionString": {
      "type": "string",
      "value": "[listkeys(resourceId('Microsoft.EventHub/namespaces/AuthorizationRules', '${var.application_name}', 'RootManageSharedAccessKey'), '2017-04-01').primaryConnectionString]"
    },
    "defaultConnectionString": {
      "type": "string",
      "value": "[listkeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/AuthorizationRules', '${var.application_name}', '${var.eventhub_name}', 'default'), '2017-04-01').primaryConnectionString]"
    },
    "eventHubNamespaceId": {
      "type": "string",
      "value": "[resourceId('Microsoft.EventHub/namespaces', '${var.application_name}')]"
    }
  }
}
</span><span class="no">  TEMPLATE
</span><span class="p">}</span>
</code></pre></div></div>

<p>To access the defined output variables, you can use the <code class="language-plaintext highlighter-rouge">jsondecode</code> function.</p>

<div class="language-tf highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">output</span> <span class="s2">"authorization_rule_primary_connection_string"</span> <span class="p">{</span>
  <span class="nx">value</span>       <span class="p">=</span> <span class="nx">jsondecode</span><span class="p">(</span><span class="nx">azurerm_resource_group_template_deployment</span><span class="p">.</span><span class="nx">eventhub</span><span class="p">.</span><span class="nx">output_content</span><span class="p">).</span><span class="nx">defaultConnectionString</span><span class="p">.</span><span class="nx">value</span>
  <span class="nx">sensitive</span>   <span class="p">=</span> <span class="kc">true</span>
<span class="p">}</span>
</code></pre></div></div>]]></content><author><name>Stefan Probst</name></author><summary type="html"><![CDATA[Enabling log compaction via the Terraform resource azurerm_eventhub is currently not feasible. Additionally, a bug has been identified within azurerm_eventhub_authorization_rule. Attempting to create an authorization rule encounters an issue when log compaction is enabled.]]></summary></entry><entry><title type="html">Streamlining Azure Pipelines - Automating Avro Schema Publication to Event Hubs Schema Registry</title><link href="http://www.stefanprobst.dev/azure-pipeline-register-avro-schema-event-hub/" rel="alternate" type="text/html" title="Streamlining Azure Pipelines - Automating Avro Schema Publication to Event Hubs Schema Registry" /><published>2024-03-14T00:00:00+00:00</published><updated>2024-03-14T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/azure-pipeline-register-avro-schema-event-hub</id><content type="html" xml:base="http://www.stefanprobst.dev/azure-pipeline-register-avro-schema-event-hub/"><![CDATA[<p>At present, the Azure CLI does not provide a means to upload schemas into the Event Hubs registry. Currently, the only options available are through the JavaScript SDK and the Azure Portal. I have been exploring a straightforward method to publish schemas seamlessly through a pipeline. This approach ensures that schemas are automatically versioned, released with each deployment, and mitigates the risk of oversight.</p>

<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code>- task: AzureCLI@2
  displayName: Register Avro Schemas
  inputs:
    azureSubscription: "Service Connection"
    scriptType: "bash"
    scriptLocation: "inlineScript"
    inlineScript: |
      response=$(az account get-access-token --resource https://&lt;Namespace_Name&gt;.servicebus.windows.net)

      token="Bearer $(echo "$response" | jq -r .accessToken)"

      avro='{"namespace": "com.azure.schemaregistry.samples","type": "record","name": "Order","fields": [{"name": "id","type": "string"},{"name": "amount","type": "double"}]}'

      curl -X PUT -d "$avro" -H "Content-Type:application/json" -H "Authorization:$token" -H "Serialization-Type:Avro" 'https://&lt;Namespace_Name&gt;.servicebus.windows.net/$schemagroups/&lt;SchemaGroup_Name&gt;/schemas/&lt;Schema_Name&gt;?api-version=2020-09-01-preview'
</code></pre></div></div>

<p>Using the AzureCLI@2 task along with a service connection, we can execute <code class="language-plaintext highlighter-rouge">az</code> commands and obtain an access token for the *.servicebus.windows.net resource. With this token, we can push Avro schemas into the registry.</p>

<p>In a further step, I extended the script to read all Avro files from a directory.</p>

<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nv">response</span><span class="o">=</span><span class="si">$(</span>az account get-access-token <span class="nt">--resource</span> https://&lt;Namespace_Name&gt;.servicebus.windows.net<span class="si">)</span>

<span class="nv">token</span><span class="o">=</span><span class="s2">"Bearer </span><span class="sb">`</span><span class="nb">echo</span> <span class="nv">$response</span> | jq .<span class="s2">"accessToken"</span> | <span class="nb">tr</span> <span class="nt">-d</span> <span class="s1">'"'</span><span class="sb">`</span><span class="s2">"</span>

<span class="k">for </span>file <span class="k">in</span> <span class="k">*</span>.avro<span class="p">;</span> <span class="k">do
  </span><span class="nv">avro_data</span><span class="o">=</span><span class="si">$(</span><span class="nb">cat</span> <span class="s2">"</span><span class="nv">$file</span><span class="s2">"</span><span class="si">)</span>     
  <span class="nv">schema_name</span><span class="o">=</span><span class="s2">"</span><span class="k">${</span><span class="nv">file</span><span class="p">%.*</span><span class="k">}</span><span class="s2">"</span>

  <span class="nb">echo</span> <span class="s2">"Publishing </span><span class="nv">$file</span><span class="s2"> to schema registry &lt;Namespace_Name&gt;"</span>
  <span class="nb">echo</span> <span class="s2">"Content: </span><span class="nv">$avro_data</span><span class="s2">"</span>

  curl <span class="nt">-X</span> PUT <span class="nt">-d</span> <span class="s2">"</span><span class="nv">$avro_data</span><span class="s2">"</span> <span class="nt">-H</span> <span class="s2">"Content-Type:application/json"</span> <span class="nt">-H</span> <span class="s2">"Authorization:</span><span class="nv">$token</span><span class="s2">"</span> <span class="nt">-H</span> <span class="s2">"Serialization-Type:Avro"</span> <span class="s2">"https://&lt;Namespace_Name&gt;.servicebus.windows.net/</span><span class="se">\$</span><span class="s2">schemagroups/&lt;SchemaGroup_Name&gt;/schemas/</span><span class="nv">$schema_name</span><span class="s2">?api-version=2020-09-01-preview"</span>
<span class="k">done</span>
</code></pre></div></div>]]></content><author><name>Stefan Probst</name></author><summary type="html"><![CDATA[At present, the Azure CLI does not provide a means to upload schemas into the Event Hubs registry. Currently, the only options available are through the JavaScript SDK and the Azure Portal. I have been exploring a straightforward method to publish schemas seamlessly through a pipeline. This approach ensures that schemas are automatically versioned, released with each deployment, and mitigates the risk of oversight.]]></summary></entry><entry><title type="html">Food Intolerances App Idea</title><link href="http://www.stefanprobst.dev/food-intolerances-app-idea/" rel="alternate" type="text/html" title="Food Intolerances App Idea" /><published>2020-06-21T00:00:00+00:00</published><updated>2020-06-21T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/food-intolerances-app-idea</id><content type="html" xml:base="http://www.stefanprobst.dev/food-intolerances-app-idea/"><![CDATA[<p>Food intolerances, hypersensitivity to certain food or food allergies are on the rise throughout the entire world. Suffering from food intolerances myself, I came up with the idea of inventing an app, that would allow you to easily find out, which products were safe to eat and which ones to avoid.</p>

<p>There might already be apps out there that tell you which food groups to avoid in general. You might know that you don’t feel well after eating strawberries or drinking dairy milk. But did you know that some of your favorite products might contain traces of them? What does E322 even stand for? Standing in the supermarket, you usually wouldn’t suspect fructose in products whose labels don’t mention any fruit. Studying the list of ingredients on every label makes shopping a very lengthy process. It was very frustrating, and I was, quite literally, fed up with dealing with constant stomach pain because I had accidentally eaten something I wasn’t supposed to. Not knowing whether certain products will make you sick can be a burden on your social life and your family. Finally, I came up with the idea of building an app that would do the work for me and warn me before buying a pricey product that would only end up in the bin. This seemed so much better both health-wise and environmentally.</p>

<p>This app will help you as well, to source out all the food that you can’t eat and all the food that you can happily indulge on. With many processed foods out there, it’s always good to know what is safe to eat and what not. I am here to help you and make shopping easier and less time consuming and still healthy. I’m still in the early stages of development and love to get your feedback on this. Any intolerance that you are still missing? Please do let me know.</p>]]></content><author><name>Stefan Probst</name></author><summary type="html"><![CDATA[Food intolerances, hypersensitivity to certain food or food allergies are on the rise throughout the entire world. Suffering from food intolerances myself, I came up with the idea of inventing an app, that would allow you to easily find out, which products were safe to eat and which ones to avoid.]]></summary></entry><entry><title type="html">How I made €2,187.50 revenue with sausage stickers</title><link href="http://www.stefanprobst.dev/sausage-sticker/" rel="alternate" type="text/html" title="How I made €2,187.50 revenue with sausage stickers" /><published>2016-09-28T00:00:00+00:00</published><updated>2016-09-28T00:00:00+00:00</updated><id>http://www.stefanprobst.dev/sausage-sticker</id><content type="html" xml:base="http://www.stefanprobst.dev/sausage-sticker/"><![CDATA[<p>Welcome to the online butchery! Here you will find the best and freshest sausage stickers!</p>

<p>Are you sick of seeing the same hipster stickers everywhere? Have you attached countless stupid stickers of unknown bands? Do you finally want to demonstrate your love for salami and co. to everyone around you? Then sausage stickers are exactly what you need!</p>

<p>100% vegan, gluten-free and definitely zero calories!</p>

<p>–</p>

<h2 id="how-it-started">How it started</h2>
<p>The idea was born on my way home from the office. In Cologne, as in every major city, the streets are lined with stickers, nice and ugly. After some very technical projects I was looking for something different that I could build easily as an MVP, and I got inspired by the sticker art at the traffic lights. I no longer know exactly how I came up with the idea of sausage stickers, but it immediately seemed very funny to me.</p>

<p>The same evening I bought the domains wurststicker.de and sausagesticker.com. The next day I set up the site on Shopify, connected my account to the Stripe API and PayPal, and uploaded some nice-looking mockups of the very first products. I also wrote the product description within a few minutes and without too much hesitation. The shop went live on Oct 02, 2015, only two days after the initial idea.</p>

<p>Shopify is really nice if you need to set up a shop in a few hours and validate your product idea.</p>

<h2 id="the-first-sales">The first sales</h2>
<p>The really difficult part was marketing the site. The biggest issue, as with every project, is getting attention. If there was any chance people would really like the idea of some crazy stickers, I needed them to visit the site. Buying ads was not an option because it was not foreseeable how well the idea would turn out. The only way for me was to get featured on blogs or news sites covering similar topics. So I wrote a short email with a personal touch, describing the product in simple words.</p>

<p>Subject: Sausage Sticker for your Notebook</p>

<blockquote>
  <p>Hello XY,
my name is Stefan and I’m a software developer. Many of my friends are nerds. They cover their notebooks with the very same startup or band stickers. That was too boring for me; I wanted something crazy for my MacBook.
For this reason I invented the sausage sticker.
The sticker is currently available in the versions salami, mortadella, bacon and steak. Other sausages are already in the works and will follow in the near future.
Maybe the topic is interesting for your site.
Best regards
Stefan</p>
</blockquote>

<p>Unfortunately, I did not keep a list of all the recipients I wrote to, but it must have been more than 200 emails. Most of them never replied, but some seemed to be really interested. One of the first bloggers to publish an article about my little project was Marc from testspiel.de. His site drove some traffic and brought the first sales. This was really impressive to me, because I did not think I could get the first sales in such a short period of time and with so little effort.</p>

<h2 id="going-viral">Going Viral</h2>
<p>What happened then was really amazing. Marc’s article began its journey through the web. Many other sites started writing articles about the sausage stickers, and people on Facebook shared my site and told their friends about the project. This peaked on Oct 4, 2015 with nearly 4,000 page views and a revenue of €298.00.</p>

<p>After that weekend my girlfriend and I had a lot of work to do: printing invoices, putting stamps on the envelopes, writing thank-you Post-its and putting everything together.</p>

<p>The absolute traffic peak occurred on the day Germany’s biggest satire magazine put a small link on their page. Although the traffic was really impressive, it did not lead to an increase in sales. This might be because people coming from a satire magazine are neither in the right mindset nor do they believe in the seriousness of such a project.</p>

<h2 id="the-numbers">The Numbers</h2>
<p>To date the project has made a revenue of €2,187.50. Fees and printing costs still have to be deducted from this amount. I shipped the sausages to Germany, France, Austria, the US, Australia and Iceland. The best-selling product is the “Salami Sticker”, which sold 140 times. More than 75% of the payments were made through PayPal.</p>

<h2 id="conclusion">Conclusion</h2>
<p>For me this project was an amazing journey and a lot of fun. I do not think a sustainable business model could be developed from it without much more effort, but it yielded some bucks and motivated me to develop and publish my ideas quickly in the future.</p>