I've been doing some testing with rate limiting on an X440 in the lab. Fairly standard, boring test setup.
Ingress using a meter:
configure meter rl_50meg committed-rate 50000 Kbps max-burst-size 50000 Kb out-actions drop
Egress using rate-limit egress:
configure ports 3 rate-limit egress 50000 Kbps max-burst-size 50000 Kb
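For reference, those numbers mean the burst bucket holds exactly one second's worth of traffic at the committed rate. A minimal sketch of the token-bucket arithmetic as I understand it (illustrative Python, not anything the switch actually runs):

committed_rate_bps = 50_000 * 1_000  # committed-rate 50000 Kbps
max_burst_bits     = 50_000 * 1_000  # max-burst-size 50000 Kb (kilobits)

# A full bucket lets this much traffic through above the committed
# rate in one go before the out-action (drop) kicks in:
print(max_burst_bits / 1e6, "Mbit of one-off headroom")    # 50.0

# An idle bucket refills at the committed rate, so going from
# empty back to full takes:
print(max_burst_bits / committed_rate_bps, "s to refill")  # 1.0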
The policy applied to ingress on port 3 just sets the meter rl_50meg, nothing else.
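For completeness, the policy is a single match-all entry whose only action is the meter, something like this (the entry name is my own placeholder):

entry rl_all {
    if match all {
    } then {
        meter rl_50meg;
    }
}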
The rate limit behaves as expected: graphing at 1-minute intervals whilst pushing traffic through gives a fairly straight line at around 49 Mbit/s. All happy there.
When the sample period is decreased (i.e. the resolution increases), some strange artefacts start to appear. At a 5-second poll interval (I didn't go any faster) there are distinct spikes in the graph up to 60 Mbit/s every 60 seconds (see graph below). Is this related to the way the hardware is performing the rate limiting?
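For what it's worth, some back-of-envelope arithmetic makes me suspect the burst bucket: a full 50 Mbit bucket emptied into a single 5-second window would read as exactly 60 Mbit/s, while the same burst averaged over a 1-minute window all but vanishes. A sketch of the sampling maths (illustrative numbers, not actual counter readings from the switch):

def rate_mbps(octets_delta, interval_s):
    # Average rate over one poll interval, the way the grapher sees it.
    return octets_delta * 8 / interval_s / 1e6

steady_octets = 50_000_000 // 8 * 5  # 5 s of traffic at ~50 Mbit/s
burst_octets  = 50_000_000 // 8      # one full 50 Mbit burst bucket

print(rate_mbps(steady_octets, 5))                 # 50.0  - a quiet window
print(rate_mbps(steady_octets + burst_octets, 5))  # 60.0  - burst lands in one window
print(rate_mbps(burst_octets, 60))                 # ~0.83 - same burst at 1-minute polls

What I can't account for is the 60-second periodicity, i.e. why the bucket would fill and dump roughly once a minute.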
This is more of a "curious to know what's happening" question than a problem.