
Returning an empty array instead of null

I want to return an empty array, but it keeps returning null. How can I resolve this in my code?

I want the mentions array to be empty in this case, but it keeps coming back as null.

function getTweetData(tweet) {
  let hashtag = tweet.match(/#\w+/g)
  let atSign = tweet.match(/@\w+/g)

  let tweetObj = {
    tags: hashtag,
    mentions: atSign,
    tagCount: 0,
    mentionCount: 0,
    length: tweet.length
  }

  if (tweet.match(/#/g))
    tweetObj.tagCount++

  if (tweet.match(/@/g))
    tweetObj.mentionCount++

  console.log(tweetObj)
  return tweetObj
}
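For context, the issue can be reproduced with the tweet used in the test below: String.prototype.match returns null (not an empty array) when the regex finds no match, so mentions ends up null for a tweet without any @mentions. The expected output shown here is an illustration of that behavior:

getTweetData('My awesome tweet about #coding')
// => { tags: ['#coding'], mentions: null, tagCount: 1, mentionCount: 0, length: 30 }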

Test

it('Should increase the count of tags', () => {
    expect(getTweetData('My awesome tweet about #coding')).to.eql({ tags: ['#coding'], mentions: [], tagCount: 1, mentionCount: 0, length: 30 })
  });

You can modify:

let atSign = tweet.match(/@\w+/g)

to:

let atSign = tweet.match(/@\w+/g) || []

Adding || [] will assign an empty array to atSign in the case where your regex returns null (that is, when the tweet contains no match).
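A minimal illustration of the fallback, using the tweet from the test (no @mentions in it):

'My awesome tweet about #coding'.match(/@\w+/g)        // null - no match found
'My awesome tweet about #coding'.match(/@\w+/g) || []  // []   - fallback to empty array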

You can do this:

let hashtag = tweet.match(/#\w+/g) || []
let atSign = tweet.match(/@\w+/g)  || []
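With both fallbacks in place, the function returns empty arrays instead of null when a tweet has no hashtags or mentions, so the test above expecting mentions: [] should pass. A sketch of the full function with only this change applied:

function getTweetData(tweet) {
  // || [] falls back to an empty array when match() finds nothing and returns null
  let hashtag = tweet.match(/#\w+/g) || []
  let atSign = tweet.match(/@\w+/g) || []

  let tweetObj = {
    tags: hashtag,
    mentions: atSign,
    tagCount: 0,
    mentionCount: 0,
    length: tweet.length
  }

  if (tweet.match(/#/g))
    tweetObj.tagCount++

  if (tweet.match(/@/g))
    tweetObj.mentionCount++

  return tweetObj
}

getTweetData('My awesome tweet about #coding')
// => { tags: ['#coding'], mentions: [], tagCount: 1, mentionCount: 0, length: 30 }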
