
Visual Studio Load Test CPU usage skyrockets with JSON context parameters

I've used Visual Studio 2015 to record a Web Performance test that is used by a Load Test. After the initial recording I could run 20 simultaneous users with only about 25% CPU usage. However, the website I'm testing against uses JSON, so to make the tests more realistic I added a custom extraction rule for JSON:

[DisplayName("JSON Extraction Rule")]
[Description("Extracts the specified JSON value from an object.")]
public class JsonExtractionRule : ExtractionRule
{
    [DisplayName("Name/path of attribute")]
    [Description("String to fetch the attribute from the Json Object.")]
    public string Name { get; set; }

    [DisplayName("Fetch 4 first characters only")]
    [Description("Set to true if only the first 4 characters of the attribute should be fetched.")]
    public Boolean fourFirstCharacters { get; set; }

    public override void Extract(object sender, ExtractionEventArgs e)
    {
        if (e.Response.BodyString != null)
        {
            var json = e.Response.BodyString;

            if (json.StartsWith("["))
            {
                json = "{\"array\":" + json + "}";
            }

            var data = JObject.Parse(json);

            if (data != null)
            {
                var attribute = data.SelectToken(Name).ToString();
                if (fourFirstCharacters)
                {
                    e.WebTest.Context.Add(this.ContextParameterName, attribute.Substring(0, 4));
                } else
                {
                    e.WebTest.Context.Add(this.ContextParameterName, attribute);
                }                   
                e.Success = true;
                return;
            }
        }

        e.Success = false;
        e.Message = String.Format(CultureInfo.CurrentCulture, "Not Found: {0}", Name);
    }
}

I used this rule to extract several JSON attributes from the responses and passed them on to subsequent requests via context parameters.
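For reference, this is roughly how a rule like this gets wired to a request when the same flow is written as a coded web test. The class name, URL, and JSONPath below are placeholders of my own, not values from the actual test:

using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class OrdersCodedWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // First request: fetch the JSON and extract a value from the response body.
        WebTestRequest request1 = new WebTestRequest("https://example.com/api/orders"); // placeholder URL

        JsonExtractionRule extractId = new JsonExtractionRule();
        extractId.Name = "array[0].orderId";        // hypothetical JSONPath into the response
        extractId.fourFirstCharacters = false;
        extractId.ContextParameterName = "OrderId"; // the extracted value is stored under this name
        request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractId.Extract);

        yield return request1;

        // Subsequent request: read the extracted value back out of the web test context.
        WebTestRequest request2 = new WebTestRequest(
            "https://example.com/api/orders/" + this.Context["OrderId"].ToString());
        yield return request2;
    }
}

In the recorded (declarative) test the same thing happens through the {{OrderId}}-style context parameter binding in the request URL or body.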

After adding several extractions like these and passing the resulting context parameters on to the requests, my Load Test reaches 100% CPU usage with only 5 simultaneous users.

The above extraction rule is used about 20 times in the Web Performance test, and between 0 and 5 times on any single response. The JSON responses and requests are about 2k characters long. The largest extracted value is about 500 characters.

I could understand if the response times went up as a result of more realistic JSON being passed, but I do not understand why CPU usage skyrockets. Can the response times of the requests affect the CPU usage of the Load Test?

Are extraction rules plus context parameters a bad way (in terms of CPU usage) of feeding the JSON into the requests? Is there a smarter way to do this that would save CPU?

It turned out that the majority of the CPU usage was caused by also using the above custom JSON extraction logic in a validation rule. I removed that rule, since the validation wasn't critical, and everything ran much more smoothly.
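For anyone hitting the same thing where the parsing itself is the cost: one option, sketched here under the assumption that the same JSON.NET setup is available (the rule name and the semicolon-separated Paths property are my own invention), is to parse each response body once and pull out several values in that single pass, instead of running JObject.Parse once per extraction rule:

using System;
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Newtonsoft.Json.Linq;

[DisplayName("Multi-value JSON Extraction Rule")]
[Description("Parses the response body once and extracts several JSON values in a single pass.")]
public class MultiJsonExtractionRule : ExtractionRule
{
    [DisplayName("Paths")]
    [Description("Semicolon-separated list of JSONPath expressions, e.g. \"orderId;customer.name\".")]
    public string Paths { get; set; }

    public override void Extract(object sender, ExtractionEventArgs e)
    {
        if (string.IsNullOrEmpty(e.Response.BodyString))
        {
            e.Success = false;
            e.Message = "Empty response body";
            return;
        }

        var json = e.Response.BodyString;
        if (json.StartsWith("["))
        {
            json = "{\"array\":" + json + "}";
        }

        // Parse once, then walk the parsed tree for every requested path.
        var data = JObject.Parse(json);
        foreach (var path in Paths.Split(';'))
        {
            var token = data.SelectToken(path.Trim());
            if (token != null)
            {
                // Name each context parameter after the base name plus the path;
                // adjust the naming scheme to whatever the requests expect.
                e.WebTest.Context[this.ContextParameterName + "_" + path.Trim()] = token.ToString();
            }
        }

        e.Success = true;
    }
}

Whether that saves anything noticeable for 2k-character bodies is something to measure rather than assume; in my case the validation rule was the real culprit.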
