2021-09-16 | ~2 min read | 214 words
I wanted to test how my application would react if one of my endpoints was really slow.
While I could have modified Chrome's network settings to throttle my speed or introduce latency, I wanted the rest of the application to run as normal. That mattered because the feature I wanted to test was nested fairly deeply in the app, so I didn't want to wait on slow connections the whole way down, or toggle the latency on and off for each test.
Fortunately, I am using MirageJS to mock my endpoints, and it has a very convenient options object that can be used to introduce latency. It's passed as the third argument to server.post:
import { Response, Server } from "miragejs"
import { EightySixedItemAction } from "../../../src/store"

export function routesForAvailability(server: Server) {
  server.post(
    `expo/menu/availability`,
    ({ db }, request) => {
      // handle request
    },
    {
      // delay this route's responses by 5,000 ms (5 seconds)
      timing: 5_000,
    },
  )
}
The documentation is here. This per-route timing is slightly different from the server-wide response time, which is also configurable.
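For comparison, here's a minimal sketch of that server-wide setting, assuming the usual createServer setup; the /api/items route is just a hypothetical placeholder:

import { createServer } from "miragejs"

createServer({
  routes() {
    // Global default: every route handler waits ~2 seconds before responding.
    this.timing = 2_000

    // Hypothetical endpoint, here only to illustrate the global delay.
    this.get("/api/items", () => {
      return { items: [] }
    })
  },
})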
Note, however, that as far as I can tell there's no way to deterministically vary the timing based on the contents of the request. Ah, well. C'est la vie. At least now I know how to slow down the response for a single mocked endpoint!