MIMIC SFLOW Protocol Module Guide

  1. Table of Contents

  2. Overview

    The MIMIC SFLOW Protocol Module is an optional facility that simulates the standard sFlow service as detailed at sFlow.org.

  3. Installation

    sFlow support is made available in MIMIC as an optional dynamically loadable module. Starting with MIMIC 12.00, you can use the Protocol Wizard to install the SFLOW module. If you prefer to enable SFLOW by hand, you need to do the following:

    • Use File->Terminate to stop any running MIMIC daemon.

    • Copy the SFLOW shared library (sflow.dll on Windows, sflow.so on Unix) from "bin/dynamic/optional" to "bin/dynamic" in the install directory.

    • Install the license keys as detailed in the instructions e-mailed to you.

    • Restart MIMIC. You should see the following type of message in the MIMICLog confirming that the SFLOW module was properly loaded:
      INFO  - SFLOW : Loaded protocol from < path-to-DLL >
      INFO  - SFLOW v12.00
      

    Once SFLOW is loaded, any agent instance configured to support the sFlow service will be able to send sFlow data to an sFlow collector.

  4. Using SFLOW from MIMICView

    If the SFLOW module is enabled, the Agent->Add, Agent->Configure and Agent->Paste dialogs will display SFLOW as an additional checkbox in the Advanced pane along with the SNMP protocols. When the checkbox is selected, a new SFLOW pane appears.

    This SFLOW configuration pane lets the user configure the parameters for an SFLOW session:

    • Config file

      This mandatory parameter specifies the SFLOW configuration file which determines what sFlow data is generated. You will not be able to start the agent unless this parameter is set.

      The configuration file is detailed below. You can either edit configuration files directly, or use the sFlow Wizard.

    • Collector

      This optional parameter specifies the address of the collector. If no collector is defined, no flows are exported. Multiple collectors can only be set through the scripting interface, as shown below.

    • Collector Port

      This optional parameter determines the collector port to use. The default port is 6343.

    • Exports per minute

      This optional parameter specifies the frequency of samples generated. The default is 6 per minute (10 second interval).

    • Encoding (Compact/Expanded)

      This optional parameter controls the encoding of sample values; the default is Compact.

    • Samples (Both/Flow/Counter)

      This optional parameter specifies whether to include Flow samples, Counter samples, or Both (the default).

    • Records/Sample (All/Single/Random)

      This optional parameter controls the records per sample. The default is All.

    • Samples/Datagram (All/Single/Random)

      This optional parameter allows you to specify the number of samples per datagram (default All).

    • Maximum datagram size

      This optional parameter sets the maximum datagram size, with a default of 1400.
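    Two of the numeric defaults above can be sanity-checked with a short Python sketch (illustrative helper code, not part of MIMIC; the 1500-byte Ethernet MTU is an assumption): the default of 6 exports per minute corresponds to a 10-second interval, and the default 1400-byte datagram fits within a standard MTU after IPv4 and UDP headers are added, avoiding fragmentation.

```python
# Illustrative arithmetic for two of the SFLOW defaults above
# (hypothetical helper code, not part of MIMIC).

def export_interval_seconds(exports_per_minute: float) -> float:
    """Interval between exports, in seconds."""
    if exports_per_minute <= 0:
        raise ValueError("exports per minute must be positive")
    return 60.0 / exports_per_minute

# Default rate of 6 exports per minute -> one export every 10 seconds.
print(export_interval_seconds(6))  # -> 10.0

# Datagram-size budget, assuming a standard 1500-byte Ethernet MTU:
ETHERNET_MTU = 1500  # bytes available for the whole IP packet
IPV4_HEADER = 20     # minimum IPv4 header (no options)
UDP_HEADER = 8
max_unfragmented = ETHERNET_MTU - IPV4_HEADER - UDP_HEADER
print(max_unfragmented)          # -> 1472
print(1400 <= max_unfragmented)  # -> True
```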

  5. Using SFLOW from MIMICShell

    A few new commands and some enhanced existing commands can be used from the MIMICShell to control the SFLOW functionality. Here is a synopsis:

    • mimic protocol msg SFLOW get args

      This command lets the user retrieve the self-describing list of required arguments and their attributes. The parameters are detailed above. A sample exchange for this command would be:

      mimicsh> mimic protocol msg SFLOW get args
      {{filename} {Config File} {file} {scripts/sflow {{*.cfg {sflow config files}
      {edit yes} {new yes}}} - both} {mandatory} {}}
      {{collector} {Collector} {string} {} {mandatory} {}}
      {{collectorport} {Collector Port} {string} {} {mandatory} {6343}}
      {{flows_per_min} {Exports per minute} {number} {} {optional} {6}}
      {{encoding_type} {Encoding (Compact/Expanded)} {radio} {} {optional} {Compact}}
      {{include_samples} {Samples (Both/Flow/Counter)} {string} {} {optional} {All}}
      {{records_per_sample} {Records/Sample (All/Single/Random)} {string} {} {optional} {All}}
      {{samples_per_datagram} {Samples/Datagram (All/Single/Random)} {string} {} {optional} {All}} 
      {{max_datagram_size} {Maximum datagram size} {number} {} {optional} {1400}}
      
      

    • mimic agent get protocol

      This command lets the user look at the protocols currently configured on the agent. A sample exchange for this command would be:

        mimicsh> mimic agent get protocol
        snmpv1,snmpv2c,SFLOW
      

    • mimic agent set protocol

      This command lets the user change the protocol setting for an agent. A sample exchange for this command would be:

        mimicsh> mimic agent get protocol
        snmpv1
        mimicsh> mimic agent set protocol snmpv1,SFLOW
        mimicsh> mimic agent get protocol
        snmpv1,SFLOW
      

    • mimic agent protocol msg SFLOW get config

      This command lets the user get the current argument settings. A sample exchange for this command would be:

        mimicsh> mimic agent protocol msg SFLOW get config
        {filename=} {collector=} {collectorport=6343} {flows_per_min=6}
        {encoding_type=Compact} {include_samples=All} {records_per_sample=All}
        {samples_per_datagram=All} {max_datagram_size=}
      
      
      

    • mimic agent protocol msg SFLOW set config [config]

      This command lets the user change the current argument settings of all SFLOW sessions for an agent. A sample exchange for this command would be:

        mimicsh> mimic agent protocol msg SFLOW get config
        {filename=} {collector=} {collectorport=6343}
      
        mimicsh> mimic agent protocol msg SFLOW set config \{collector=192.9.200.71 192.9.200.72\}
      
        mimicsh>  mimic agent protocol msg SFLOW get config
        {filename=} {collector=192.9.200.71 192.9.200.72} {collectorport=6343}
      

    • mimic agent protocol msg SFLOW get trace
      mimic agent protocol msg SFLOW set trace [0 or 1]

      These commands let the user examine and change the SFLOW tracing configuration for an agent. A sample exchange would be:

        mimicsh> mimic agent assign 1
      
        mimicsh> mimic agent protocol msg SFLOW get trace
        0
        mimicsh>  mimic agent protocol msg SFLOW set trace 1
      
        mimicsh> mimic agent protocol msg SFLOW get trace
        1
      
      and the log would show:

      INFO  01/23.11:45:35 - agent 1 trace enabled for SFLOW
      INFO  01/23.11:45:35 - SFLOW[AGT=1]: sent to [192.9.200.71,6343] - V5 datagram
      ...
      

    • mimic protocol msg SFLOW get stats_hdr
      mimic agent protocol msg SFLOW get statistics

      These commands return SFLOW statistics information:

      • a list of statistic headers, and
      • the current statistics values for the specified agent.

      In order, the statistic values are:

      • Total number of SFLOW packets sent.
      • Total number of SFLOW packets discarded.

      A sample exchange for these commands would be:

        mimicsh> mimic protocol msg SFLOW get stats_hdr
        {{pktSnt} {PktsSent}} {{pktDisc} {PktsDiscarded}}
      
        mimicsh> mimic agent protocol msg SFLOW get statistics
        190 0
      

    • mimic agent protocol msg SFLOW halt
      mimic agent protocol msg SFLOW reload
      mimic agent protocol msg SFLOW resume

      This group of commands lets the user reload the configuration file for an agent without stopping it, in effect allowing dynamic reconfiguration of the flows generated by this exporter. Flow generation must first be halted; the reload command then reloads the configuration file, and the resume command continues flow generation.
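      A sample exchange for these commands (illustrative; it simply performs the halt/reload/resume sequence described above) would be:

        mimicsh> mimic agent protocol msg SFLOW halt

        mimicsh> mimic agent protocol msg SFLOW reload

        mimicsh> mimic agent protocol msg SFLOW resume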

  6. Recording sFlow
  7. To create a default simulation, you can record from an sFlow packet capture (PCAP) with the sflowrec utility. This tool works in either of two modes:

    • in file mode, a previously captured PCAP file is read in

    • in collector mode, it captures live sFlow packets from an Exporter and saves them in a temporary PCAP file

    It looks at the sFlow packets in the PCAP file and, for the first Exporter it finds, creates a simulation that attempts to make the sFlow module generate equivalent flows. For each flow source (a combination of UDP port and sub-agent ID) from the Exporter IP address, a sample set will be generated for all the flows from that source.

    For better recordings, here are recommendations on how to capture sFlow packets with a packet capture program like Wireshark:

    • in order to capture large, fragmented packets, it is better to filter by exporter or collector IP address than by collector port. A port filter matches only the first fragment of a fragmented IP packet, so subsequent fragments are discarded and sFlow packets are missed in the capture.
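    For example, with standard capture-filter (BPF) syntax as used by Wireshark and tcpdump (the address is illustrative):

      host 192.9.200.71     filters by exporter/collector address; keeps all fragments
      udp port 6343         filters by collector port; misses non-first fragments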

    The sFlow Wizard gives you a user-friendly interface in front of this tool.

    This section documents the command-line options to this utility, either

    • --file pcap-file

      read sFlow packets from the specified packet capture (PCAP) file.

    or

    • --localport port

      specifying this option causes sflowrec to act as a Collector, reading packets from the specified port.

    • --count count

      optional argument to specify how many packets to collect. If not specified, the collector will capture packets until interrupted (e.g. with Ctrl-C).

    and the common options

    • --out output-file

      this optional argument specifies the output file name. By default, the output file name will be the same as the PCAP file, but with the .cfg suffix.

    • --exporter host-address

      this optional argument specifies a host-address to filter the captured packets. Only packets from this exporter will be considered.

    • --collector host-address

      this optional argument specifies a host-address to filter the captured packets. Only packets from this collector will be considered.

    • --port port

      this optional argument specifies a port to filter the captured packets. Only packets to the specified collector port will be considered for simulation.

    • --start start
    • --stop stop

      with these 2 options you can specify a range of packets to be recorded.

    • --exclude templatefile

      by default, certain flows will be simulated using a template in template/sflow/. This option prevents the specified template file from being used.

    • --sequential

      by default, integer fields found to be in a range of values will be simulated as returning random numbers in the range (via the range.mtcl action script). By specifying this option, you will force sequential values in the range (via the seq.mtcl action).
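    To illustrate the options above, here are two hypothetical invocations, one per mode (file names and addresses are examples only):

      sflowrec --file capture.pcap --exporter 192.9.200.71 --out capture.cfg

      sflowrec --localport 6343 --count 100 --collector 192.9.200.72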

  8. Simulation Configuration
  9. sFlow data to be generated by the simulated exporter is specified by the configuration file loaded into the agent instance. The sFlow Wizard gives you a user-friendly interface for editing an sFlow configuration file.

    • Global parameters

      These apply to all the sample sets defined in this configuration file.
      Configurable   Description
      Comments       User-editable comment about the configuration. This value does not impact the simulation and is solely intended for self-documentation.

    • Includes

      The files containing definitions of the sFlow structures used in this configuration have to be included in this section.

    • Sample sets

      Each configuration generates one or more sample sets as defined here.
      Configurable        Description
      sub_agent_id        Distinguishes between datagram streams from separate agent sub-entities within a device.
      sys_uptime_offset   Offset from system uptime for this sample set, used for the uptime field in the packet header.
      frame_sequence      Incremented with each sample datagram generated by a sub-agent within an agent; used for the sequence_number field in the packet header.
      flow_sequence       Incremented with each flow sample generated by this source_id; used for the sequence_number field in the flow sample.
      counter_sequence    Incremented with each counter sample generated by this source_id; used for the sequence_number field in the counter sample.

      A sample set has counter samples and/or flow samples.
      Configurable   Description
      data_source    sFlowDataSource - the encoding is as in the sFlow specification.
      The following formats are supported:

      • i, a positive integer - a combined value of datasource type and datasource id;

      • t:i, positive integers - datasource type t and datasource id i;

      • i1 i2 i3, positive integers - multiple combined datasource type and id;

      • t1:i1 t2:i2 t3:i3, positive integers - multiple datasource type t* and id i*;

      • action-script.mtcl - an action script that dynamically generates the datasources;

      • *Entry - generates instances of the tabular *Entry, e.g. ifEntry;
      sample         One or more sample blocks with a struct defined in the includes section.
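    The combined data_source encoding used by the first four formats can be sketched in Python. Per the sFlow version 5 specification, the most significant byte of the 32-bit source_id carries the datasource type and the lower three bytes carry the id (these helpers are illustrative, not part of MIMIC):

```python
# Illustrative sFlowDataSource encoding per the sFlow v5 specification:
# the top byte of the 32-bit value is the type, the low 24 bits the id.

def encode_data_source(ds_type: int, ds_id: int) -> int:
    """Combine datasource type t and id i into a single source_id."""
    if not (0 <= ds_type <= 0xFF and 0 <= ds_id <= 0xFFFFFF):
        raise ValueError("type must fit in 8 bits, id in 24 bits")
    return (ds_type << 24) | ds_id

def decode_data_source(source_id: int) -> tuple:
    """Split a combined source_id back into (type, id)."""
    return (source_id >> 24) & 0xFF, source_id & 0xFFFFFF

# "0:3" (type 0 = ifIndex, id 3) has the combined value 3;
# "1:5" has the combined value 16777221.
print(encode_data_source(0, 3))      # -> 3
print(encode_data_source(1, 5))      # -> 16777221
print(decode_data_source(16777221))  # -> (1, 5)
```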

  10. Compatibility
  11. Click here for the compatibility document. If you get an error, you need to download the optional update package with the Update Wizard.