Pallet Level Push

Overview

The pallet level push interface sends a message for every pallet that passes through the Kargo system. This document describes the message payload and the ways to consume this data. Put simply, once the Kargo system has finished processing a pallet that has passed through the Kargo towers or lift camera, it can produce a message. The message is customizable: any field that can be read off a label can be included. Because a label can sometimes carry multiple values for a field, those values are sent as lists.

Payload

The payload itself is a JSON object. Some fields are always present; others are customer specific.

Field             Description
businessSlug      Business slug as defined by Kargo.
facilitySlug      Facility slug as defined by Kargo. Can be used together with the business slug to identify the facility at which this pallet was observed.
kargoShipmentId   The ID of the shipment in the Kargo platform.
kargoPalletId     The ID of the pallet in the Kargo platform.
occurredAt        The UTC timestamp, in ISO 8601 format, for when the pallet passed a Kargo tower.
url               URL linking to the pallet in the Kargo dashboard.
direction         Direction of the pallet: LOADING or UNLOADING. If a pallet gets both loaded and unloaded, you'll receive two events; your consumer logic should cancel out the load/unload pair.
orders            List of orders to which this pallet belongs.
dockId            Dock at which the pallet was observed.

As for the customer-specific fields: during the integration process we will agree on the specific fields you're interested in, and we can assign any name that works for you. Below is an example with some customization:

{
  "businessSlug": "business",
  "facilitySlug": "facility",
  "kargoShipmentId": "1562067",
  "kargoPalletId": "121159700",
  "occurredAt": "2025-05-28T04:59:44.583Z",
  "url": "https://athena.mykargo.com/shipments/1562067/media?type=loading&activity=1211597&pallet=121159700",
  "direction": "LOADING",
  "orders": ["12345678"],
  "dockId": "1",

  "LPN": "111000678920",
  "SKUs": ["SKU-65000"], // Sent as a list when multiples of a field are expected
  "ExpirationDate": "2025-06-01",
  "LotNumber": "01123"
}
note

The custom label fields that are pushed (such as LPN, SKUs, ExpirationDate, LotNumber, etc.) are configured by Kargo based on the specific label fields and any other required data for your facility. These fields will be determined during the integration process.
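Since the payload mixes a fixed set of Kargo fields with integration-specific label fields, a consumer may want to separate the two. The sketch below shows one way to do that, assuming the standard field names listed in the table above; the custom field names are whatever was agreed during integration.

```python
# Sketch: split a pallet-event payload into Kargo-standard fields and
# customer-specific label fields. The standard set comes from the field
# table above; anything else is treated as a custom label field.
STANDARD_FIELDS = {
    "businessSlug", "facilitySlug", "kargoShipmentId", "kargoPalletId",
    "occurredAt", "url", "direction", "orders", "dockId",
}

def split_payload(payload: dict) -> tuple[dict, dict]:
    """Return (standard, custom) views of a pallet-event payload."""
    standard = {k: v for k, v in payload.items() if k in STANDARD_FIELDS}
    custom = {k: v for k, v in payload.items() if k not in STANDARD_FIELDS}
    return standard, custom
```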

Receiving the Message

There are two ways to receive the push messages published by Kargo.

  1. Expose a webhook that Kargo can send a payload to with the pallet data.
  2. Use the pushMessages Query and optionally the pushMessage Subscription in the GraphQL API to fetch the messages from a service you build.

Webhook Method

Receiving the message is as simple as exposing an endpoint that Kargo can call with the payload. The most important part to negotiate during the integration process is the authorization scheme and any firewall rules that need to be set up.

Authentication

Kargo is capable of implementing any authentication scheme, but a few common options are:

Basic Authentication

For basic authentication, we agree on a username/password pair. The credentials are encoded in base64 as <user:pass> and passed in the Authorization header as Basic <value>.
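The header value can be built in a couple of lines; this sketch uses Python's standard library, with "user"/"pass" as placeholder credentials:

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    # Encode "user:password" as base64 and prefix with "Basic" (RFC 7617).
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"
```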

Token Authentication

For token-based auth, you provide Kargo with a username/password and a login endpoint from which to fetch the token. With every request we then ensure we have a fresh token and pass it in the Authorization header as prescribed by your token auth scheme.
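The "fresh token on every request" behavior described above amounts to caching the token and refreshing it when it nears expiry. A minimal sketch, with the actual login call injected as a `fetch` callable (the endpoint, TTL, and refresh policy are assumptions to be agreed during integration):

```python
import time

class TokenCache:
    """Keep a bearer token fresh, refreshing via `fetch` when it expires."""

    def __init__(self, fetch, ttl_seconds: float, clock=time.monotonic):
        self._fetch = fetch          # e.g. a call to your login endpoint
        self._ttl = ttl_seconds
        self._clock = clock
        self._token = None
        self._expires_at = 0.0

    def get(self) -> str:
        # Refresh only when no token is held or the cached one has expired.
        if self._token is None or self._clock() >= self._expires_at:
            self._token = self._fetch()
            self._expires_at = self._clock() + self._ttl
        return self._token
```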

Firewall Rules

The Kargo push service will always push from the same IP. If you need to whitelist that IP, it can be provided during the integration discussion.

Example Call

Below is an example call that simulates a call from the Kargo system to the customer server under the webhook scheme. The payload comes in as part of a POST request to the /api/palletEvent route you expose on your server. This example uses Basic authorization.

Host: <customer_name-server.com>
Port: <port>
Route: /api/palletEvent

curl --location --request POST \
"https://<customer_name-server.com>:<port>/api/palletEvent" \
--header "Authorization: Basic <value>" \
--header "Content-Type: application/json" \
--data-raw '<payload>'
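On the receiving side, the endpoint just needs to check the agreed Authorization header and parse the JSON body. The framework-agnostic sketch below factors that into one function you could wire into any web server; the "user"/"pass" credentials and the status-code choices are placeholders, not part of the Kargo contract:

```python
import base64
import json

# Placeholder credentials; the real pair is agreed during integration.
EXPECTED = "Basic " + base64.b64encode(b"user:pass").decode("ascii")

def handle_pallet_event(headers: dict, body: bytes) -> tuple[int, str]:
    """Return an (HTTP status, message) pair for an incoming pallet event."""
    if headers.get("Authorization") != EXPECTED:
        return 401, "unauthorized"
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return 400, "invalid JSON"
    # Hand off to your own processing here, e.g. enqueue payload["kargoPalletId"].
    return 200, "ok"
```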

GraphQL Method

If exposing a webhook is not an option with your tech stack, it is also possible to use the GraphQL API in order to fetch the messages that would have been pushed to the webhook. There are two GraphQL methods that are exposed in order to make this as convenient as possible.

Each of these messages includes an ID. As a best practice, store this ID in local state to ensure that messages aren't processed more than once.
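The deduplication step described above can be sketched as a small helper; the in-memory set stands in for whatever durable store (database table, key-value store) your service actually uses:

```python
def process_new(messages, seen_ids: set, handler) -> None:
    """Invoke handler only for messages whose id hasn't been seen before."""
    for msg in messages:
        if msg["id"] in seen_ids:
            continue        # already processed on an earlier run
        handler(msg)
        seen_ids.add(msg["id"])
```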

pushMessages Query

The pushMessages query can be used to fetch all the messages that have occurred at a facility, or for a business, since a given timestamp.

Example Usage

Query:

query PushMessages($input: PushMessageFilter!) {
  pushMessages(input: $input) {
    id
    businessSlug
    facilitySlug
    messageType
    message
    sentAt
  }
}

Variables:

{
  "input": {
    "businessSlug": "business",
    "since": "2024-01-01T00:00:00+06:00"
  }
}

Example Response:

{
  "data": {
    "pushMessages": [
      {
        "id": 1,
        "businessSlug": "business",
        "facilitySlug": "facility",
        "messageType": "PALLET_EVENT",
        "message": "<payload>",
        "sentAt": "2024-09-13T23:30:00+06:00"
      },
      {
        "id": 2,
        "businessSlug": "business",
        "facilitySlug": "facility",
        "messageType": "PALLET_EVENT",
        "message": "<payload>",
        "sentAt": "2024-09-13T23:45:00+06:00"
      }
    ]
  }
}
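Calling the query from your own service amounts to POSTing a JSON body with the query text and variables to the GraphQL endpoint. The sketch below builds that body; the transport (HTTP client, endpoint URL, auth headers) is left to your stack:

```python
import json

# The query text mirrors the pushMessages example above.
PUSH_MESSAGES_QUERY = """
query PushMessages($input: PushMessageFilter!) {
  pushMessages(input: $input) { id businessSlug facilitySlug messageType message sentAt }
}
"""

def push_messages_request(business_slug: str, since_iso: str) -> bytes:
    """Serialize a pushMessages request body, ready to POST to the GraphQL endpoint."""
    return json.dumps({
        "query": PUSH_MESSAGES_QUERY,
        "variables": {"input": {"businessSlug": business_slug, "since": since_iso}},
    }).encode("utf-8")
```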

pushMessage Subscription

The pushMessage subscription can be used to receive messages as they're published, minimizing latency. Used together with the pushMessages query on service startup (to load any messages since the last time the service ran), it delivers the push messages in real time. The GraphQL subscription runs over a websocket, so additional firewall rules may need to be applied in this case.

subscription PushMessage($filter: PushMessageSubscriptionFilter!) {
  pushMessage(filter: $filter) {
    id
    businessSlug
    facilitySlug
    messageType
    message
    sentAt
  }
}

Variables:

{
  "filter": {
    "businessSlug": "business"
  }
}

Example Response:

{
  "data": {
    "pushMessage": {
      "id": 3,
      "businessSlug": "business",
      "facilitySlug": "facility",
      "messageType": "PALLET_EVENT",
      "message": "<payload>",
      "sentAt": "2024-09-13T23:50:00+06:00"
    }
  }
}

By using the pushMessages query to load historical messages and the pushMessage subscription to receive real-time updates, you can ensure that your service has the most up-to-date information with minimal latency.
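The catch-up-then-stream pattern can be sketched as a single consumer loop. Here the query and subscription are injected as callables (`fetch_since` and `subscribe`), which are assumed to be thin wrappers you write around the GraphQL API using whichever client library fits your stack:

```python
def run_consumer(fetch_since, subscribe, handle, last_seen_ts: str) -> None:
    """Catch up on missed messages, then process live ones as they arrive.

    fetch_since(ts) -> list of messages   (wraps the pushMessages query)
    subscribe()     -> iterable of messages (wraps the pushMessage subscription)
    """
    seen = set()
    # 1. Backfill everything published since the last run.
    for msg in fetch_since(last_seen_ts):
        if msg["id"] not in seen:
            handle(msg)
            seen.add(msg["id"])
    # 2. Then stream new messages in real time, skipping any overlap.
    for msg in subscribe():
        if msg["id"] not in seen:
            handle(msg)
            seen.add(msg["id"])
```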