Shahzad Bhatti Welcome to my ramblings and rants!

January 1, 2023

Consumer-driven and Producer-generated Contract Testing for REST APIs

Filed under: REST,Testing,Web Services — admin @ 9:43 pm

Though the REST style for remote APIs is fairly loose, you can document the API shape and structure using standards such as OpenAPI and Swagger specifications. A documented API specification ensures that both the consumer/client and the producer/server abide by the specification and prevents unexpected behavior. The API provider may also define service-level objectives (SLOs) so that the API meets specified latency, security, availability, and other service-level indicators (SLIs). The API provider can use contract tests to validate API interactions based on the documented specifications. Contract testing involves both the consumer and the producer, where the consumer makes an API request and the producer returns the result. The contract tests ensure that both consumer requests and producer responses match the request and response definitions in the API specifications. These contract tests don't just validate the API schema; they validate the interactions between consumer and producer, so they can also be used to detect breaking or backward-incompatible changes, allowing consumers to keep using the APIs without any surprises.

In order to demonstrate contract testing, we will use the api-mock-service library to generate mock/stub client requests and server responses based on OpenAPI specifications or customized test contracts. These test contracts can be used by both consumers and producers for validating API contracts, and they can evolve as the API specifications are updated.

Sample REST API Under Test

A sample eCommerce application will be used to demonstrate contract testing. The application uses various REST APIs to implement an online shopping experience. The primary purpose of this example is to show how different request structures can be passed to the REST APIs to generate a valid result or an error condition for contract testing. You can view the Open-API specifications for this sample app here.

Customer REST APIs

The customer APIs define operations to manage customers who shop online, e.g.:

Customer APIs

Product REST APIs

The product APIs define operations to manage products that can be shopped online, e.g.:

Product APIs

Payment REST APIs

The payment APIs define operations to charge credit card and pay for online shopping, e.g.:

Payment APIs

Order REST APIs

The order APIs define operations to purchase a product from the online store; they use the above APIs to validate customers, check product inventory, charge payments, and then store a record of orders, e.g.:

Order APIs

Generating Stub Server Responses based on Open-API Specifications

In this example, stub server responses will be generated by api-mock-service based on the OpenAPI specification ecommerce-api.json, by starting the mock service first as follows:

Shell

And then uploading open-API specifications for ecommerce-api.json:

Shell

It will generate test contracts with stub/mock responses for all APIs defined in the ecommerce-api.json OpenAPI specification. For example, you can produce a result for the customers REST API, e.g.:

Shell

to produce:

JSON

The above response is randomly generated based on the types, formats, regex patterns, and min/max limits of the properties defined in the Open-API specification, and calling this API will automatically generate both valid and error responses; e.g., calling “curl http://localhost:8000/customers” again will return:

JSON

Consumer-driven Contract Testing

Upon uploading the Open-API specifications of the microservices, the api-mock-service generates test contracts for each REST API and response status. You can then customize these test cases for consumer-driven contract testing.

For example, here is the default test contract generated for finding a customer by id with path “/customers/:id”:

YAML

The above template demonstrates the interaction between consumer and producer by defining properties such as the following (a hand-written sketch of such a contract appears after the list):

  • method – of REST API such as GET/POST/PUT/DELETE
  • name – of the test case
  • path of REST API
  • description – of test
  • predicate – defines a condition which must be true to select this test contract
  • request section defines input properties for the REST API including:
    • match_query_params – to match query input parameters for selecting the test contract
    • match_headers – to match input headers for selecting the test contract
    • match_contents – defines regex for selecting input body
    • path_params – defines path variables and regex
    • query_params and headers – defines sample input parameters and headers
  • response section defines output properties for the REST API including:
    • headers – defines response headers
    • contents – defines body of response
    • contents_file – allows loading response from a file
    • status_code – defines HTTP response status
  • wait_before_reply – defines wait time before returning response
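
To make that structure concrete, here is a small hand-written sketch of such a contract for the find-customer API. It only illustrates the fields listed above; the regex patterns, customer properties, and response body are illustrative assumptions rather than the contract actually generated by the tool:

YAML

method: GET
name: get-customer-by-id
path: /customers/:id
description: find a customer by id
request:
    match_headers:
        Content-Type: "application/json"
    path_params:
        id: "[a-zA-Z0-9_-]{1,36}"
response:
    headers:
        Content-Type:
            - application/json
    contents: '{"id": "{{.id}}", "firstName": "{{RandName}}", "lastName": "{{RandName}}", "email": "{{RandEmail}}"}'
    status_code: 200
wait_before_reply: 0s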

You can then invoke test contract using:

Shell

which generates a test case response from the mock/stub server provided by the api-mock-service library, e.g.:

JSON

You can customize the above response contents using built-in template functions in the api-mock-service library or create additional test contracts for each distinct input parameter. For example, the following contract defines the interaction between consumer and producer to add a new customer:

YAML
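
As a rough, hand-written illustration (not the exact contract generated by the tool), such an add-customer contract might look like the sketch below; the customer fields and the match_contents regex are assumptions based on the sample application:

YAML

method: POST
name: save-customer
path: /customers
description: add a new customer
request:
    match_headers:
        Content-Type: "application/json"
    # regex that the request body must match for this contract to be selected
    match_contents: '"email":\s*"[^"]+@[^"]+"'
response:
    headers:
        Content-Type:
            - application/json
    contents: '{"id": "{{Udid}}", "firstName": "{{RandName}}", "lastName": "{{RandName}}", "email": "{{RandEmail}}"}'
    status_code: 200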

The above template defines the interaction for adding a new customer, where the request section defines the format of the request and the matching criteria using the match_contents property. The response section includes the headers and contents that are generated by the stub/mock server for consumer-driven contract testing. You can then invoke the test contract using:

Shell

Which will return a response such as:

JSON

Note: The response will not match the request body, because contract testing only tests interactions between consumer and producer without maintaining any server-side state. You can use other types of testing, such as integration, component, or functional testing, for validating state-based behavior.

Producer-driven Generated Tests

The process of defining contracts to generate tests for validating producer REST APIs is similar to consumer-driven contracts. For example, you can upload open-api specifications or user-defined contracts to the api-mock-service provided mock/stub server.

For example, you can upload open-API specifications for ecommerce-api.json as follows:

Shell

Upon uploading the specifications, the mock server will generate contracts for each REST API and status. You can customize those contracts with additional validation or assertions and then invoke the server-generated tests, either for a specific REST API or for multiple REST APIs belonging to a group. You can also define an execution order for the tests in a group and optionally pass data from one invocation to the next invocation of a REST API.

For testing purposes, we will customize the customer REST APIs for adding a new customer and fetching a customer by its id, i.e.:

A contract for adding a new customer

YAML

The request section defines a contents property that builds the input request, which is sent to the producer's REST API. The response section defines match_contents to match a regex for each response property and, in addition, defines assertions to compare the response contents, headers, and status against the expected output.
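
The following hand-written sketch shows roughly how these pieces fit together for the add-customer producer test. The field regexes are illustrative, and the assertion predicate name (VariableContains) is an assumption about the library's assertion syntax rather than output copied from the tool:

YAML

method: POST
name: save-customer
path: /customers
group: customers
request:
    # request body that will be sent to the real producer API,
    # built with the library's random-data template functions
    contents: '{"firstName": "{{RandName}}", "lastName": "{{RandName}}", "email": "{{RandEmail}}"}'
response:
    # regex that each relevant response property must match
    match_contents: '"id":\s*"[a-zA-Z0-9_-]+"'
    assertions:
        # predicates evaluated against the real response (predicate name is assumed)
        - VariableContains headers.Content-Type application/json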

A contract for finding an existing customer

YAML

The above template defines similar properties to generate the request body, and it defines match_contents with assertions to match the expected output headers, body, and status. Based on the order of the tests, the generated test to add a new customer executes first, followed by the test to find a customer by id. As we are testing against real REST APIs, the path for finding a customer is defined as “/customers/{{.id}}” so that the id is populated from the output of the first test based on the pipe_properties.

Uploading Contracts

Once you have the api-mock-service mock server running, you can upload contracts using:

Shell

You can start your service before invoking the generated tests; e.g., we will use sample-openapi for testing purposes and then invoke the generated tests using:

Shell

The above command will execute all tests for the customers group, invoking each REST API 5 times, and will then generate a result as follows:

JSON

Though the generated tests are executed against real services, it is recommended that the service implementation use test doubles or mock services for any dependent services, as contract testing is not meant to replace component or end-to-end tests, which provide better support for integration testing.

Recording Consumer/Producer interactions for Generating Stub Requests and Responses

Contract testing does not always depend on API specifications such as Open API and Swagger; instead, you can record interactions between consumers and producers using the api-mock-service tool.

For example, if you have an existing REST API or a legacy service such as above sample API, you can record an interaction as follows:

Shell

This will invoke the remote REST API, record contract interactions and then return server response:

JSON

The recorded contract can be used to generate the stub response, e.g. following configuration defines the recorded contract:

YAML

You can then invoke consumer-driven contracts to generate stub responses, or invoke the generated tests against the producer implementation as described in the earlier sections. Another benefit of capturing test contracts from a recorded session is that it accurately captures all URLs, parameters, and headers for both requests and responses, so contract testing can precisely validate against existing behavior.

Summary

Though unit testing, component testing, and end-to-end testing are common testing strategies used by most organizations, they don't provide adequate support to validate API specifications and interactions between consumers/clients and producers/providers of the APIs. Contract testing ensures that consumers and producers will not deviate from the specifications and can be used to validate changes for backward compatibility as APIs evolve. It also decouples consumers and producers while an API is still in development, as both parties can write code against the agreed contracts and test them independently. A service owner can generate producer contracts using tools such as api-mock-service based on an Open API specification or user-defined constraints. Consumers can provide their consumer-driven contracts to the service providers to ensure that API changes don't break any consumers. These contracts can be stored in a source code repository or on a registry service so that contract tests can easily access them and execute them as part of the build and deployment pipelines. The api-mock-service tool greatly assists in adding contract testing to your software development lifecycle and is freely available from https://github.com/bhatti/api-mock-service.

December 20, 2022

Property-based and Generative testing for Microservices

Filed under: REST,Technology,Testing — Tags: — admin @ 1:26 pm

The software development cycle for microservices generally includes unit testing during development, where mock implementations of dependent services are injected with the desired behavior to test various scenarios and failure conditions. However, development teams often use real dependent services for integration testing of a microservice in a local environment. This poses a considerable challenge, as each dependent service may keep its own state, which makes it harder to reliably validate regression behavior or simulate certain error responses. Further, as the number of request parameters to the service or downstream services grows, the combinatorial explosion of test cases becomes unmanageable. This is where property-based testing offers relief, as it allows testing against automatically generated input fuzz data, which is why this form of testing is also referred to as generative testing. A generator defines a function that generates random data based on the type of input and constraints on the range of input values. The property-based test driver then iteratively calls the system under test to validate the result and assert the desired behavior, e.g.:

Python

In the above example, the input parameters are randomly generated based on a precondition. The generated parameters are passed to the function under test, and the test driver validates the result based on property assertions. This entire process is also referred to as fuzzing, and it is repeated over a fixed range to identify any input parameters for which the property assertions fail. There are many libraries for property-based testing in various languages, such as QuickCheck, fast-check, junit-quickcheck, and ScalaCheck, but we will use the api-mock-service library to demonstrate these capabilities for testing microservice APIs.

The following sections describe how the api-mock-service library can be used for testing microservices with fuzzing/property-based approaches and for mocking dependent services to produce the desired behavior:

Sample Microservices Under Test

A sample eCommerce application will be used to demonstrate property-based and generative testing. The application uses various microservices to implement an online shopping experience. The primary purpose of this example is to show how different parameters can be passed to the microservices, where the microservice APIs validate the input parameters, perform simple business logic, and then generate a valid result or an error condition. You can view the Open-API specifications for this sample app here.

Customer APIs

The customer APIs define operations to manage customers who shop online, e.g.:

Customer APIs

Product APIs

The product APIs define operations to manage products that can be shopped online, e.g.:

Product APIs

Payment APIs

The payment APIs define operations to charge credit card and pay for online shopping, e.g.:

Payment APIs

Order APIs

The order APIs define operations to purchase a product from the online store; they use the above APIs to validate customers, check product inventory, charge payments, and then store a record of orders, e.g.:

Order APIs

Defining Test Scenarios with Open-API Specifications

In this example, test scenarios will be generated by api-mock-service based on the OpenAPI specification ecommerce-api.json, by starting the mock service first as follows:

Shell

And then uploading open-API specifications for ecommerce-api.json:

Shell

It will generate mock APIs for all microservices; e.g., you can produce a result for the products API:

Shell

to produce:

JSON

The above response is randomly generated based on the properties defined in the Open-API specification, and calling this API will automatically generate both valid and error responses; e.g., calling “curl http://localhost:8000/products” again will return:

JSON

Applying Property-based/Generative Testing for Clients of Microservices

Upon uploading the Open-API specifications of the microservices, the api-mock-service automatically generates templates for producing mock responses and error conditions. These can be customized for property-based and generative testing of microservice clients by defining constraints for generating input/output data and assertions for request/response validation.

Client-side Testing for Listing Products

You can find generated mock scenarios for listing products on the mock service using:

Shell

which returns:

Shell

and then invoking above URL paths, e.g.

Shell

which will return randomly generated response such as:

YAML

We can customize the above response contents using built-in template functions in the api-mock-service library to generate a fuzz response, e.g.:

YAML

In the above example, we slightly improved the test template by generating product entries in a loop and using built-in functions to randomize the data (a hand-written sketch of such a fuzzed scenario follows the upload command below). You can upload this scenario using:

Shell
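
For reference, a fuzzed list-products scenario along these lines might look like the following hand-written sketch. The product fields and loop bounds are assumptions about the sample application, and driving the loop with RandIntArrayMinMax is purely for illustration; the helper functions (Udid, RandName, EnumString, RandNumMinMax, RandBool) are described in the custom-function list later in this series:

YAML

method: GET
name: list-products
path: /products
response:
    headers:
        Content-Type:
            - application/json
    contents: >
        [
        {{- range $i, $val := RandIntArrayMinMax 3 7 }}
        {{- if $i }},{{ end }}
        {"id": "{{Udid}}", "name": "{{RandName}}", "category": "{{EnumString `BOOKS` `MUSIC` `TOYS`}}", "price": {{RandNumMinMax 10 500}}, "inStock": {{RandBool}}}
        {{- end }}
        ]
    status_code: 200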

You can also generate a template for returning an error response similarly, i.e.,

YAML

Invoking curl -v http://localhost:8000/products will randomly return both of those test scenarios so that client code can test for various conditions.

Client-side Testing for Creating Products

You can find mock scenarios for creating products that were generated from above Open-API specifications using:

Shell

You can then customize scenarios as follows and then upload it:

YAML

And then invoke above POST /products API using:

Shell

The client code can test for product properties and other error scenarios can be added to simulate failure conditions.

Applying Property-based/Generative Testing for Microservices

The api-mock-service test scenarios defined above can also be used to test the microservice implementations. You can start your service (e.g., we will use sample-openapi for testing purposes) and then invoke a test request for server-side testing using:

Shell

The above command will submit a request to execute all scenarios belonging to the products group five times and then return:

JSON

You can also add custom assertions to validate the response in the save-product scenario:

YAML

If you try to run it again, the execution will fail with following error because none of the categories include X:

JSON

Summary

Unit testing and other testing methodologies don't rule out the presence of bugs, but they can greatly reduce their probability. However, with large test suites, test maintenance incurs a high development cost, especially if those tests are brittle and require frequent changes. Property-based/generative testing can help fill gaps in unit testing while keeping the size of the test suite small. The api-mock-service tool is designed to mock and test microservices using fuzzing and property-based testing techniques. This mocking library can be used to test both client and server side implementations and can also be used to generate error conditions that are not easily reproducible. It can be a powerful tool in your toolbox when developing distributed systems with a large number of services, which can be difficult to deploy and test locally. You can read more about the api-mock-service library in “Mocking and Fuzz Testing Distributed Micro Services with Record/Play, Templates and OpenAPI Specifications” and download it freely from https://github.com/bhatti/api-mock-service.

October 18, 2022

Mocking and Fuzz Testing Distributed Micro Services with Record/Play, Templates and OpenAPI Specifications

Filed under: GO,REST,Technology — admin @ 11:36 am

Building large distributed systems often requires integrating with multiple distributed micro-services, which makes development particularly difficult, as it's not always easy to deploy and test all dependent services in a local environment with constrained resources. In addition, you might be working on a large system with multiple teams where you may have received new API specs from another team but the API changes are not available yet. Though you can use mocking frameworks based on API specs when writing unit tests, integration or functional testing requires access to the network service. A common solution that I have used in past projects is to configure a mock service that can simulate different API operations. I wrote a JVM based mock-service many years ago with the following use-cases:

Use-Cases

  • As a service owner, I need to mock remote dependent service(s) by capturing/recording request/responses through an HTTP proxy so that I can play it back when testing the remote service(s) without connecting with them.
  • As a service owner, I need to mock remote dependent service(s) based on open-api/swagger specifications so that my service client can test all service behavior per specifications for the remote service(s) even when the remote service is not fully implemented or accessible.
  • As a service owner, I need to mock remote dependent service(s) based on a mock scenario defined in a template so that my service client can test service behavior per expected request/response in the template even when remote service is not fully implemented or accessible.
  • As a service owner, I need to inject various response behavior and faults to the output of a remote service so that I can build a robust client that prevents cascading failures and is more resilient to unexpected faults.
  • As a service owner, I need to define test cases with faulty or fuzz responses to test my own service so that I can predict how it will behave with various input data and assert the service response based on expected behavior.

Features

The API mock service supports REST/HTTP based services with the following features:

  • Record API requests/responses by working as an HTTP proxy server (native http/https or via API) between the client and a remote service.
  • Playback API responses that were previously recorded based on request parameters.
  • Define API behavior manually by specifying request parameters and response contents using static data or dynamic data based on GO templating language.
  • Generate API behavior from open standards such as Open API/Swagger and automatically create constraints and regex based on the specification.
  • Customize API behavior using a GO template language so that users can generate dynamic contents based on input parameters or other configuration.
  • Generate large responses using the template language with dynamic loops so that you can test performance of your system.
  • Define multiple test scenarios for the API based on different input parameters or simulating various error cases that are difficult to reproduce with real services.
  • Store API request/responses locally as files so that it’s easy to port stubbed request/responses to any machine.
  • Allow users to define API request/response with various formats such as XML/JSON/YAML and upload them to the mock service.
  • Support test fixtures that can be uploaded to the mock service and can be used to generate mock responses.
  • Define a collection of helper methods to generate different kind of random data such as UDID, dates, URI, Regex, text and numeric data.
  • Ability to playback all test scenarios or a specific scenario and change API behavior dynamically with different input parameters.
  • Support multiple mock scenarios for the same API that can be selected either using round-robin order, custom predicates based on parameters or based on scenario name.
  • Inject error conditions and artificial delays so that you can test how your system handles error conditions that are difficult to reproduce or use for game days/chaos testing.
  • Generate client requests for a remote API for chaos and stochastic testing where a set of requests are sent with a dynamic data generated based on regex or other constraints.

I used this service in many past projects; however, I felt it needed a fresh approach to meet the above goals, so I rewrote it in the GO language, which has robust support for writing network services. You can download the new version from https://github.com/bhatti/api-mock-service. As it's written in GO, you can either install the GO runtime environment or use Docker to run it locally. If you haven't installed Docker, you can download the community edition from https://docs.docker.com/engine/installation/ or find an installer for your OS at https://docs.docker.com/get-docker/.

Shell

or pull an image from docker hub (https://hub.docker.com/r/plexobject/api-mock-service), e.g.

Shell

Alternatively, you can run it locally with GO environment, e.g.,

Shell

For full command line options, execute api-mock-service -h, which will show command line options such as:

Shell

Recording a Mock Scenario via HTTP/HTTPS Proxy

Once you have the API mock service running, it will listen on two ports: the first port (default 8080) is used to record/play mock scenarios, update templates, or upload OpenAPI specifications. The second port (default 8081) sets up an HTTP/HTTPS proxy server that you can point your client at to record scenarios, e.g.:

Shell

The above curl command will automatically record all requests and responses and create a mock scenario to play back. For example, if you call the same API again, it will return a local response instead of contacting the server. You can customize the proxy's record behavior by adding an X-Mock-Record: true header to your request.

Recording a Mock Scenario via API

Alternatively, you can invoke an internal API as a pass-through to a remote API so that you can automatically record the API behavior and play it back later, e.g.:

Shell

In the above example, the curl command passes the URL of the real service in the Mock-Url HTTP header. In addition, you can pass other authorization headers as needed.

Viewing the Recorded Mock Scenario

The API mock-service will store the request/response in a YAML file under a data directory that you can specify. For example, you may see a file under:

Shell

Note: the sensitive authentication or customer keys are masked in this example, but you will see contents such as the following in the captured data file:

YAML

The above example defines a mock scenario for testing the /v1/customers/cus_**/cash_balance path. A test scenario includes:

Predicate

  • A boolean condition that enables or disables the scenario based on dynamic parameters or the request count.

Group

  • This specifies the group for related test scenarios.

Request Matching Parameters:

The matching request parameters will be used to select the mock scenario to execute and you can use regular expressions to validate:

  • URL Query Parameters
  • URL Request Headers
  • Request Body

You can use these parameters so that test scenario is executed only when the parameters match, e.g.

YAML

For example, the above scenario will only be matched if the content type is application/json, and it validates that the name query parameter is alphanumeric with a length of 1 to 50 characters.
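
Based on that description, the matching section of such a scenario might look like the following sketch (the exact field layout is an assumption, but it conveys the idea of regex-based matching):

YAML

request:
    match_headers:
        # only select this scenario for JSON requests
        Content-Type: "application/json"
    match_query_params:
        # the name query parameter must be alphanumeric, 1 to 50 characters long
        name: "[a-zA-Z0-9]{1,50}"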

Example Request Parameters:

The example request parameters show the contents captured from the record/play so that you can use and customize to define matching parameters:

  • URL Query Parameters
  • URL Request Headers
  • Request Body

Response Properties

The response properties will include:

  • Response Headers
  • Response Body statically defined or loaded from a test fixture
  • Response can also be loaded from a test fixture file
  • Status Code
  • Matching header and contents
  • Assertions

You can copy a recorded scenario to another folder, use templates to customize it, and then upload it for playback.

The matching headers and contents use match_headers and match_contents, similar to the request section, to validate the response in case you want to test responses from a real service for chaos testing. Similarly, assertions define a set of predicates to test against the response from a real service:

YAML

The above example will check the API response and verify that the id property contains 10, the title contains illo, and the result headers include a Pragma: no-cache header.
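
A sketch of such an assertions block is shown below; the checked values come from the description above, while the predicate name VariableContains is an assumption about the library's assertion syntax:

YAML

response:
    match_headers:
        Pragma: "no-cache"
    assertions:
        # verify properties of the real response when running chaos tests
        - VariableContains contents.id 10
        - VariableContains contents.title illo
        - VariableContains headers.Pragma no-cache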

Playback the Mock API Scenario

You can playback the recorded response from above example as follows:

Shell

Which will return captured response such as:

JSON

Though you can customize your template with dynamic properties or conditional logic, you can also send the X-Mock-Response-Status HTTP header to override the HTTP status to return, or X-Mock-Wait-Before-Reply to add artificial latency using duration syntax.

Debug Headers from Playback

The playback request will return mock-headers to indicate the selected mock scenario, path and request count, e.g.

HTML

Upload Mock API Scenario

You can customize above recorded scenario, e.g. you can add path variables to above API as follows:

YAML

In the above example, I assigned the name stripe-cash-balance to the mock scenario and changed the API path to /v1/customers/:customer/cash_balance so that it can capture the customer-id as a path variable. I added a regular expression to ensure that the HTTP request includes an Authorization header matching Bearer sk_test_[0-9a-fA-F]{10}$, and I defined dynamic properties such as {{.customer}}, {{.page}} and {{.pageSize}} so that they will be replaced at runtime.
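
Putting that together, the customized scenario might look roughly like the following sketch. The path, header regex, and template variables are the ones described above, while the response body fields are illustrative assumptions:

YAML

name: stripe-cash-balance
method: GET
path: /v1/customers/:customer/cash_balance
request:
    match_headers:
        Authorization: "Bearer sk_test_[0-9a-fA-F]{10}$"
response:
    headers:
        Content-Type:
            - application/json
    contents: >
        {
          "object": "cash_balance",
          "customer": "{{.customer}}",
          "page": {{.page}},
          "page_size": {{.pageSize}},
          "livemode": false
        }
    status_code: 200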

The mock scenario uses GO's built-in template syntax. You can then upload it as follows:

Shell

and then play it back as follows:

Shell

and it will generate:

JSON

As you can see, the values of customer, page and pageSize are dynamically updated, and the response headers include the name of the mock scenario along with request counts. You can upload multiple mock scenarios for the same API, and the mock API service will play them back sequentially. For example, you can upload another scenario for the above API as follows:

YAML

And then play it back as before:

Shell

which will return the following error response:

Shell

Dynamic Templates with Mock API Scenarios

You can use the loop and conditional primitives of the template language, along with custom functions provided by the API mock library, to generate dynamic responses as follows:

YAML

Above example includes a number of template primitives and custom functions to generate dynamic contents such as:

Loops

GO templates support loops that can be used to generate multiple data entries in the response, e.g.:

YAML

Builtin functions

GO template supports custom functions that you can add to your templates. The mock service includes a number of helper functions to generate random data such as:

Add numbers

YAML

Date/Time

YAML

Comparison

YAML

Enums

YAML

Random Data

YAML

Request count and Conditional Logic

YAML

The template syntax allows you to define a conditional logic such as:

YAML

In the above example, the mock API will return HTTP status 500 or 501 for every 10th request and 200 or 400 for other requests. You can use this conditional syntax to simulate different error statuses or customize the response.
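
As a rough sketch, that conditional logic might be written along the following lines; NthRequest (returning true on every Nth request) is an assumed helper name, so check the library's function list for the exact spelling, while EnumInt is one of the documented enum helpers:

YAML

response:
    # return 500 or 501 on every 10th request, otherwise 200 or 400
    status_code: "{{if NthRequest 10}}{{EnumInt 500 501}}{{else}}{{EnumInt 200 400}}{{end}}"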

Loops

YAML

Variables

YAML

Test fixtures

The mock service allows you to upload a test fixture that you can refer to in your template, e.g.:

YAML

The above example loads a random line from a lines.txt fixture. As you may need deterministic random data in some cases, you can use the Seeded functions to generate predictable data so that the service returns the same data. The following example reads a text fixture to load a property from a file:

YAML

This template file will generate content as follows:

YAML
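
For illustration, a fixture-backed response template might look roughly like the sketch below. The property names are made up, and the argument order of SeededFileLine and FileProperty is an assumption; lines.txt and props.yaml are the fixture names used later in this post:

YAML

response:
    contents: >
        {
          "device": "{{SeededFileLine `lines.txt` .page}}",
          "random_line": "{{RandFileLine `lines.txt`}}",
          "flag": "{{FileProperty `props.yaml` `flag`}}"
        }
    status_code: 200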

Artificial Delays

You can specify artificial delay for the API request as follows:

YAML

The above example shows a delay based on the page number, but you can use any parameter to customize this behavior.
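
A minimal sketch of such a page-dependent delay, using the wait_before_reply scenario property (the property name follows the contract format shown in an earlier post in this series), might look like this:

YAML

path: /devices
response:
    status_code: 200
# artificial delay that grows with the requested page number
wait_before_reply: "{{.page}}s"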

Playing back a specific mock scenario

You can pass a header for X-Mock-Scenario to specify the name of scenario if you have multiple scenarios for the same API, e.g.

Shell

You can also customize the response status by overriding the request with the X-Mock-Response-Status header, and the delay before returning by overriding the X-Mock-Wait-Before-Reply header.

Using Test Fixtures

You can define test data in your test fixtures and then upload them as follows:

Shell

In the above example, test fixtures for lines.txt and props.yaml will be uploaded and will be available for all GET requests under the /devices URL path. You can then refer to the above fixtures in your templates. You can also use this to serve binary files; e.g., you can define an image template file as follows:

YAML
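
Such an image scenario might look roughly like the following sketch, which serves a previously uploaded binary fixture through the contents_file property described earlier; the file name, path, and header layout are illustrative assumptions:

YAML

method: GET
name: get-mockup-image
path: /images/mockup
response:
    headers:
        Content-Type:
            - image/png
    # serve the uploaded binary fixture instead of inline contents
    contents_file: mockup.png
    status_code: 200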

Then upload a binary image using:

Shell

And then serve the image using:

Shell

Custom Functions

The API mock service defines following custom functions that can be used to generate test data:

Numeric Random Data

Following functions can be used to generate numeric data within a range or with a seed to always generate deterministic test data:

  • Random
  • SeededRandom
  • RandNumMinMax
  • RandIntArrayMinMax

Text Random Data

Following functions can be used to generate text data within a range of lengths or with a seed to always generate deterministic test data:

  • RandStringMinMax
  • RandStringArrayMinMax
  • RandRegex
  • RandEmail
  • RandPhone
  • RandDict
  • RandCity
  • RandName
  • RandParagraph
  • RandSentence
  • RandString
  • RandWord

Email/Host/URL

  • RandURL
  • RandEmail
  • RandHost

Boolean

Following functions can be used to generate boolean data:

  • RandBool
  • SeededBool

UDID

Following functions can be used to generate UDIDs:

  • Udid
  • SeededUdid

String Enums

Following functions can be used to generate a string from a set of Enum values:

  • EnumString

Integer Enums

Following functions can be used to generate an integer from a set of Enum values:

  • EnumInt

Random Names

Following functions can be used to generate random names:

  • RandName
  • SeededName

City Names

Following functions can be used to generate random city names:

  • RandCity
  • SeededCity

Country Names or Codes

Following functions can be used to generate random country names or codes:

  • RandCountry
  • SeededCountry
  • RandCountryCode
  • SeededCountryCode

File Fixture

Following functions can be used to generate random data from a fixture file:

  • RandFileLine
  • SeededFileLine
  • FileProperty
  • JSONFileProperty
  • YAMLFileProperty

Generate Mock API Behavior from OpenAPI or Swagger Specifications

If you are using Open API or Swagger for API specifications, you can simply upload a YAML based API specification. For example, here is a sample Open API specification from Twilio:

YAML

You can then upload the API specification as:

Shell

It will generate mock scenarios for each API based on mime-type, status-code, parameter formats, regex, and data ranges, e.g.:

YAML

In the above example, the account_sid uses a regex to generate data and the URI format to generate a URL. Then invoke the mock API as:

Shell

Which will generate dynamic response as follows:

JSON

Listing all Mock Scenarios

You can list all available mock APIs using:

Shell

Which will return summary of APIs such as:

JSON

Chaos Testing

In addition to serving a mock service, you can also use the builtin chaos client for stochastic testing of remote services by generating random data based on regex or API specifications. For example, you may capture a test scenario for a remote API using the http proxy, such as:

Shell

This will capture a mock scenario such as:

YAML

You can then customize this scenario with additional assertions, and you may remove all response contents as they won't be used. Note that the above scenario is defined with the group todos. You can then submit a request for chaos testing as follows:

Shell

Above request will submit 10 requests to the todo server with random data and return response such as:

YAML

If you have locally captured data, you can also run the chaos client from the command line without running the mock server, e.g.:

Shell

Static Assets

The mock service can also serve static assets from a user-defined folder as follows:

Shell

API Reference

The API specification for the mock library defines details for managing mock scenarios and customizing the mocking behavior.

Summary

Building and testing distributed systems often requires deploying a deep stack of dependent services, which makes development hard in a local environment with limited resources. Ideally, you should be able to deploy and test the entire stack without using the network or requiring remote access, so that you can spend more time on building features instead of configuring your local environment. The above examples show how you can use https://github.com/bhatti/api-mock-service to mock APIs for testing purposes and to define test scenarios that simulate both happy and error cases, as well as inject faults or network delays into your testing processes so that you can test for fault tolerance. This mock library can be used to define API mock behavior using record/play, a template language, or API specification standards. I have found great use for tools like this when developing micro services, and hopefully you find it useful too. Feel free to connect with your feedback or suggestions.

March 9, 2018

Separating Control/Config Management from the Services

Filed under: REST,Technology — admin @ 9:59 pm

One of the patterns I learned early in my career was to separate the control flow from the data flow. For example, when I first looked at the FTP protocol, I noticed it listened on two separate ports for communication between client and server: a data port to transfer files and a control port to send/receive commands for managing the transfer. This allows the server to respond quickly even if the data port is busy transferring large files. In some ways, this is similar to the Bulkhead pattern for partitioning components and limiting the blast radius. When a service reaches its capacity, it slows down all requests, including any requests to control or configure it. Thus, it helps to define a separate channel where you can manage the control-service. You may also need to define special access-control policies for the control-service; for example, an admin may need to be on a trusted network for administration. In some cases, you may build a control-service for management behind the firewall while the data-service is publicly accessible. Another use-case is updating a service's configuration at runtime, where you store the configuration via the control-service, which can update it and then publish it to the data-service.

October 25, 2008

My gripes about REST services

Filed under: REST — admin @ 11:17 am

I love REST style services because of their simplicity and ease of testing. I have discussed the benefits of REST earlier. I have used a number of distributed technologies over seventeen years, such as LU6.2 (CICS) for mainframes, BSD Sockets, RPC, CORBA, RMI, JINI, messaging middlewares, SOAP, etc. In most of those technologies, you had to use special libraries to interact with the server. I worked in some organizations where I saw real dependency hell or jar hell, where I had to import dozens of client jars from different groups to talk to those services. REST, on the other hand, relies only on HTTP (and security) libraries.

I first wrote an XML-over-HTTP style service in the late 90s, before I knew about the term REST. At the time, I worked as a consultant for government DOTs and built a CORBA based system to provide traffic data to media agencies. However, people were scraping our website for the data, so I built an XML-over-HTTP service to download the data with some credentials. I think a lot of people used the simplicity of HTTP to build similar services, and many of them didn't understand REST as put forth by Roy Fielding. Over the past few years, a lot of people have been promoting the real style of REST, such as Sam Ruby, Stefan Tilkov, Steve Vinoski and Roy Fielding himself. At the same time, they are chastising people who diverge from their vision.

I have tried building REST style services over the last few years, and recently I have started building new services that will be used by the entire organization. As these services will need to support a large number of transactions, performance and scalability are critical. Also, these services need to support batches of requests. In an older blog post, I discussed commandments for writing services and wrote about the importance of batch requests for scalability. That requirement changes a few things; for example, instead of taking advantage of request parameters in HTTP, I had to use XML for the input request and XML for the response. Though this style is suitable for POST or PUT, where you are expecting to read the request as a file, it is unnatural for GET requests.

Another tenet of REST style services is the resource. In this style, you interact with the service similar to how users interact with a web site, i.e., you click on links to go to another page, which returns more links, and so on. However, this style adds network round trips to a service. For example, in one workflow service I had to return the active tasks for a particular workflow. In true REST style, I would have to return a hyperlink for each task instead of the contents of the task, and the client would have to ask the server for the task contents by hitting the resource for each task. As you might guess, this adds significant network latency, so I am being practical instead of purist and, as a result, I am returning the task contents. Also, the service allows you to pass requests for multiple workflows and returns a single response for all workflows.

I used a similar style when using messaging middlewares that allowed serving batches of requests in a single message. I found messaging middlewares offered more flexible options; e.g., you can design an aggregator service that waits a few seconds for incoming requests and then puts them in a single request to another service that serves them. These kinds of batch services are integral to building scalable systems. I have discussed some of these limitations in an earlier blog post, especially REST's lack of a push based architecture. Though there are some workarounds for push, such as long-lived HTTP connections that are difficult to use behind a firewall, comet style or reverse Ajax based communication, which is limited, and more recently reverse http, which is too new to apply in real applications. Perhaps messaging standards like XMPP can fill this gap.

I find that the IT industry is largely driven by a single minded attitude where every new technology, language or process is adopted by a bandwagon of people who just repeat each other's words without understanding. For example, I have seen similar attitudes from agile folks, and from folks in object oriented programming, aspect oriented programming, metaprogramming or dynamic programming, design patterns, etc. In the end, I believe you have to be practical instead of purist and, as always, the answer to every question in IT is "it depends", i.e., there is no single universal solution.

August 14, 2008

Benefits of REST based services

Filed under: REST — admin @ 10:47 pm

I saw Damien Katz's blog post on REST, "I just don't get it", and I was a bit surprised that he doesn't get REST, especially since he wrote CouchDB based on REST. Though I admit there are a lot of bad examples of REST services that use REST sort of like RPC over POST, resource oriented services can be quite simple and powerful. REST principles are the building blocks of the web, and it has proven to be quite scalable and efficient. I have been developing REST based services for a number of years, in some ways before I learned about Roy Fielding's thesis and REST principles. Back in the 90s, I worked on building traffic sites and used CORBA to subscribe to and publish traffic events. We also published that data on the website, but soon we found a number of people were scraping the website, so I wrote a simple XML over HTTP service to download the data that other groups could use. I have found the following benefits when using REST based services:

  • separating reads from writes: I have worked on large ecommerce and travel websites, and one of the lessons is to keep your read/query services separate from your transactional services. REST APIs define separate operations for reads and writes/updates.
  • caching: you can find tons of off the shelf solutions for caching GET requests including hardware solutions. There are tons of features like ETags and cache headers that provide this feature.
  • compression: Since REST uses HTTP, you can use compression such as gzip. This can improve the performance of the services.
  • idempotency: GET, PUT, DELETE and HEAD are idempotent, which means if designed correctly the request can be retried without any worries about side effects. POST on the other hand is not idempotent and may have side effects.
  • bookmarking: GET requests can be easily bookmarked. It is important not to use GET to change state of application.
  • security: Though security has been the weakest area of REST compared to SOAP, HTTPS and simple authentication suffice, and there are better standards like OAuth.
  • big response size: REST/HTTP is the only service platform that I have seen supports gigabytes of responses. I have done a lot of CORBA based services in 90s, EJBs/SOAP in early 2000s and messaging based services for over ten years. None of those platforms support large size responses.
  • simplicity: I find this is the biggest reason for using REST. I can use browser to call GET based requests and write client in any language.
  • resources: REST response can include URIs for other APIs and client can change state through these resources. You can use XHTML to embed all these resources that can be easily tested with browsers.
  • No need for additional jars: When I used CORBA, EJBs, RMI or JINI, I always had to include client/skeleton jar files. Having worked in large companies where I had to import dozens of these jar files, this became a maintenance problem. With REST, I can simply call the service without importing anything.
  • Error codes: HTTP comes with a number of response codes for real life services, including overload responses such as server busy (503).
  • Meta data: As opposed to CORBA, JINI, or RMI services, I can pass metadata easily, as HTTP supports predefined and user-defined headers. These headers can include information on authentication, quality of service, timeouts or other context related data. Occasionally, I add Map<String,String> to APIs when I use Java based services, but it pollutes pure APIs.

The only real drawbacks I see for REST based services are that they are generally synchronous and blocking, which can waste threads (though some of this can be solved with async I/O or event based dispatching). Personally, I like to use messaging underneath REST services to provide asynchrony, persistence, and better reliability.
