This repository has been archived by the owner on Feb 12, 2024. It is now read-only.

Refactor refs-local to not use ipfs.add #2980

Closed
wants to merge 1 commit into from

Conversation

aphelionz
Contributor

This PR refactors the refs-local test so it no longer uses ipfs.add, relying instead on the more basic block.put. This is driven by the need for ipfs-rust/rust-ipfs to work through the conformance tests in the order planned in the dev grant.

This work is based on the techniques used in #2972.

@aphelionz
Contributor Author

aphelionz commented Apr 10, 2020

I believe I'm very close, but I just need some guidance on how to properly structure the DAGLinks and the DAGNode.

Expected root hash: QmR4nFjTu18TyANgC65ArNWp5Yaab1gPzQ4D8zp7Kx3vhr
Actual root hash: QmPS3PQjdNo9Gfp7myZMWKYeyLhUziy12vP1WhZq4QmYYC

@achingbrain
Member

The second file, holmes.txt, is large enough to be split over three DAGNodes, so you need to create a balanced DAG:

it('should get local refs', async function () {
  const createFile = async (data, chunkSize = 262144) => {
    let chunks = []

    for (let i = 0; i < data.length; i += chunkSize) {
      const unixfs = new UnixFS({
        type: 'file',
        data: data.slice(i, i + chunkSize)
      })
      const dagNode = new DAGNode(unixfs.marshal())
      const block = await ipfs.block.put(dagNode.serialize())

      chunks.push({
        unixfs,
        size: block.data.length,
        cid: block.cid
      })
    }

    if (chunks.length === 1) {
      return {
        cid: chunks[0].cid,
        cumulativeSize: chunks[0].size
      }
    }

    const unixfs = new UnixFS({
      type: 'file',
      blockSizes: chunks.map(chunk => chunk.unixfs.fileSize())
    })
    const dagNode = new DAGNode(unixfs.marshal(), chunks.map(chunk => new DAGLink('', chunk.size, chunk.cid)))
    const block = await ipfs.block.put(dagNode.serialize())

    return {
      cid: block.cid,
      cumulativeSize: chunks.reduce((acc, curr) => acc + curr.size, 0) + block.data.length
    }
  }

  const pp = await createFile(fixtures.directory.files['pp.txt'])
  const holmes = await createFile(fixtures.directory.files['holmes.txt'])
  const directory = new UnixFS({ type: 'directory' })
  const serialized = new DAGNode(directory.marshal(), [
    new DAGLink('pp.txt', pp.cumulativeSize, pp.cid),
    new DAGLink('holmes.txt', holmes.cumulativeSize, holmes.cid)
  ]).serialize()
  await ipfs.block.put(serialized)

  const refs = await all(ipfs.refs.local())

  const cids = refs.map(r => r.ref)
  expect(cids).to.include('QmVwdDCY4SPGVFnNCiZnX5CtzwWDn6kAM98JXzKxE3kCmn')
  expect(cids).to.include('QmR4nFjTu18TyANgC65ArNWp5Yaab1gPzQ4D8zp7Kx3vhr')
})

Probably want to put the createFile function somewhere reusable, otherwise our tests are going to end up very verbose.
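The chunking loop at the core of createFile can be pulled out and exercised on its own; here is a minimal sketch in plain Node (no IPFS dependencies; the chunkBuffer name is hypothetical):

```javascript
// Split a buffer into fixed-size chunks, mirroring the loop in createFile.
// chunkSize defaults to 262144 (256 KiB), the default UnixFS chunk size.
function chunkBuffer (data, chunkSize = 262144) {
  const chunks = []

  for (let i = 0; i < data.length; i += chunkSize) {
    chunks.push(data.slice(i, i + chunkSize))
  }

  return chunks
}

// A 600 KiB file splits into three chunks, which is why holmes.txt
// needs a balanced DAG: a root node linking the three leaf nodes.
const chunks = chunkBuffer(Buffer.alloc(600 * 1024))
console.log(chunks.length)    // 3
console.log(chunks[0].length) // 262144
console.log(chunks[2].length) // 90112 (the remainder)
```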

Update refs-local.js

feat: createFile method from @achingbrain

fix: linting
@aphelionz
Contributor Author

@achingbrain Much better, thank you for the help!

@aphelionz
Contributor Author

I believe the errors here and in #2982 are unrelated to the changeset in these PRs.

achingbrain added a commit that referenced this pull request Apr 24, 2020
Follows on from #2980 but uses the unixfs-importer which has been
refactored to only use the block API, allowing implementations to be
written from low-level APIs upwards.
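The commit message's point about writing implementations "from low-level APIs upwards" can be illustrated with a toy importer whose only dependency on the backing implementation is a single put function (all names here are hypothetical; this is not the real unixfs-importer API):

```javascript
// Toy sketch: a file "importer" that depends on nothing but a
// put(bytes) -> cid function, so any implementation exposing a
// block.put equivalent can drive it.
function importFile (data, put, chunkSize = 262144) {
  const leaves = []

  for (let i = 0; i < data.length; i += chunkSize) {
    const chunk = data.slice(i, i + chunkSize)
    leaves.push({ cid: put(chunk), size: chunk.length })
  }

  if (leaves.length === 1) {
    return leaves[0].cid
  }

  // The real importer would build a DAGNode with DAGLinks here;
  // a serialized list of child entries stands in for it in this sketch.
  return put(Buffer.from(JSON.stringify(leaves)))
}

// In-memory "blockstore" whose CIDs are just sequential labels.
const store = new Map()
const put = (bytes) => {
  const cid = `block-${store.size}`
  store.set(cid, bytes)
  return cid
}

const rootCid = importFile(Buffer.alloc(600 * 1024), put)
console.log(rootCid)    // block-3: the root, written after three leaf blocks
console.log(store.size) // 4 blocks in total: 3 leaves + 1 root
```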
@achingbrain
Member

I've opened #3005 which accomplishes the same thing, but uses less code :)

@aphelionz
Contributor Author

aphelionz commented Apr 24, 2020

Go ahead and close this if it works for you!

@achingbrain
Member

Fixed in #3005.
