Configuration: the plugin associated with the source model may provide pre-configured options. Inference configurations are broadly similar across models, but individual models may differ in the options they accept; read the "Configurations" chapter to learn more. An inference configuration is a set of JSON-based settings passed directly to the model. Depending on the model, you can choose the desired classes, the GPU device to use, and other options.
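As an illustration, an inference configuration might look like the following sketch. The field names (`model_classes`, `gpu_device`, `confidence_threshold`) are hypothetical placeholders chosen for this example; the actual keys depend on the model and its plugin, as described in the "Configurations" chapter.

```json
{
  "model_classes": ["person", "car"],
  "gpu_device": 0,
  "confidence_threshold": 0.5
}
```

Settings not listed in the configuration typically fall back to the plugin's pre-configured defaults.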