Compare commits

...

101 Commits

Author SHA1 Message Date
Georgi Matev
880cb899b5
Update README.md (#5466)
2024-10-11 11:42:38 -07:00
Georgi Matev
779bb70301
Update README.md type (#5465)
2024-10-11 11:38:56 -07:00
Georgi Matev
2487072d95
Update README.md with archival notice (#5463)
#### Does this PR need a docs update or release note?

- [x] No

#### Type of change

- [x] 🗺️ Documentation
2024-10-11 11:33:20 -07:00
dependabot[bot]
ad927afbc1
⬆️ Bump sass from 1.78.0 to 1.79.1 in /website (#5444)
Bumps [sass](https://github.com/sass/dart-sass) from 1.78.0 to 1.79.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.79.1</h2>
<p>To install Sass 1.79.1, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>No user-visible changes.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1791">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.79.1</h2>
<ul>
<li>No user-visible changes.</li>
</ul>
<h2>1.79.0</h2>
<ul>
<li>
<p><strong>Breaking change</strong>: Passing a number with unit <code>%</code> to the <code>$alpha</code> parameter
of <code>color.change()</code>, <code>color.adjust()</code>, <code>change-color()</code>, and <code>adjust-color()</code>
is now interpreted as a percentage, instead of ignoring the unit. For example,
<code>color.change(red, $alpha: 50%)</code> now returns <code>rgb(255 0 0 / 0.5)</code>.</p>
</li>
<li>
<p><strong>Potentially breaking compatibility fix</strong>: Sass no longer rounds RGB channels
to the nearest integer. This means that, for example, <code>rgb(0 0 1) != rgb(0 0 0.6)</code>. This matches the latest version of the CSS spec and browser behavior.</p>
</li>
<li>
<p><strong>Potentially breaking compatibility fix</strong>: Passing large positive or negative
values to <code>color.adjust()</code> can now cause a color's channels to go outside that
color's gamut. In most cases this will currently be clipped by the browser and
end up showing the same color as before, but once browsers implement gamut
mapping it may produce a different result.</p>
</li>
<li>
<p>Add support for CSS Color Level 4 [color spaces]. Each color value now tracks
its color space along with the values of each channel in that color space.
There are two general principles to keep in mind when dealing with new color
spaces:</p>
<ol>
<li>
<p>With the exception of legacy color spaces (<code>rgb</code>, <code>hsl</code>, and <code>hwb</code>), colors
will always be emitted in the color space they were defined in unless
they're explicitly converted.</p>
</li>
<li>
<p>The <code>color.to-space()</code> function is the only way to convert a color to
another color space. Some built-in functions may do operations in a
different color space, but they'll always convert back to the original space
afterwards.</p>
</li>
</ol>
</li>
<li>
<p><code>rgb</code> colors can now have non-integer channels and channels outside the normal
gamut of 0-255. These colors are always emitted using the <code>rgb()</code> syntax so
that modern browsers that are being displayed on wide-gamut devices can
display the most accurate color possible.</p>
</li>
<li>
<p>Add support for all the new color syntax defined in Color Level 4, including:</p>
<ul>
<li><code>oklab()</code>, <code>oklch()</code>, <code>lab()</code>, and <code>lch()</code> functions;</li>
<li>a top-level <code>hwb()</code> function that matches the space-separated CSS syntax;</li>
<li>and a <code>color()</code> function that supports the <code>srgb</code>, <code>srgb-linear</code>,
<code>display-p3</code>, <code>a98-rgb</code>, <code>prophoto-rgb</code>, <code>rec2020</code>, <code>xyz</code>, <code>xyz-d50</code>, and
<code>xyz-d65</code> color spaces.</li>
</ul>
</li>
<li>
<p>Add new functions for working with color spaces:</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
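
For a concrete check of the 1.79 changes quoted above, a snippet like the following can be compiled with the modern JS API. This is a minimal sketch; the file name, the `oklch` values, and the exact emitted CSS are illustrative.

```ts
// check-colors.ts — a minimal sketch against sass >= 1.79 (npm i sass)
import * as sass from "sass";

const scss = `
@use "sass:color";
a {
  // 1.79: the % unit on $alpha is now honored rather than ignored,
  // so this compiles to rgb(255 0 0 / 0.5).
  color: color.change(red, $alpha: 50%);

  // Color Level 4: values stay in their defining space unless converted
  // explicitly, and color.to-space() is the one conversion entry point.
  background: color.to-space(oklch(70% 0.1 120), rgb);
}
`;

console.log(sass.compileString(scss).css);
```
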
<details>
<summary>Commits</summary>
<ul>
<li><a href="5fa04d3dbc"><code>5fa04d3</code></a> Fix sass-parser publishing (<a href="https://redirect.github.com/sass/dart-sass/issues/2349">#2349</a>)</li>
<li><a href="d740d02e10"><code>d740d02</code></a> Emit deprecation warnings for the legacy JS API (<a href="https://redirect.github.com/sass/dart-sass/issues/2343">#2343</a>)</li>
<li><a href="a957eeadd1"><code>a957eea</code></a> Bump chokidar to v4 (<a href="https://redirect.github.com/sass/dart-sass/issues/2347">#2347</a>)</li>
<li><a href="aa35aa20dd"><code>aa35aa2</code></a> Bump bufbuild/buf-setup-action in /.github/util/initialize (<a href="https://redirect.github.com/sass/dart-sass/issues/2346">#2346</a>)</li>
<li><a href="f826ed2e54"><code>f826ed2</code></a> Stop emitting <code>mixed-decls</code> in a bunch of unnecessary cases (<a href="https://redirect.github.com/sass/dart-sass/issues/2342">#2342</a>)</li>
<li><a href="2f0d0daaf4"><code>2f0d0da</code></a> Merge pull request <a href="https://redirect.github.com/sass/dart-sass/issues/2341">#2341</a> from sass/feature.color-4</li>
<li><a href="de181d9192"><code>de181d9</code></a> Poke CI</li>
<li><a href="34f98c703b"><code>34f98c7</code></a> Update color API tests</li>
<li><a href="422f037ebd"><code>422f037</code></a> Fix a typo</li>
<li><a href="4db68a1d4f"><code>4db68a1</code></a> Merge pull request <a href="https://redirect.github.com/sass/dart-sass/issues/2339">#2339</a> from sass/merge-main</li>
<li>Additional commits viewable in <a href="https://github.com/sass/dart-sass/compare/1.78.0...1.79.1">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.78.0&new-version=1.79.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

2024-09-18 05:58:05 +00:00
ashmrtn
b086f8c3ff
Use new client created for PnP ops in purge script (#5442)
PowerShell switched to requiring certificate credentials, so the existing cleanup jobs have been failing since the switch.

#### Does this PR need a docs update or release note?

- [x] No

#### Type of change

- [x] 🐛 Bugfix
- [x] 💻 CI/Deployment
2024-09-17 23:08:14 +00:00
dependabot[bot]
d9bf48be7e
⬆️ Bump dompurify from 3.0.6 to 3.1.6 in /website (#5437)
Bumps [dompurify](https://github.com/cure53/DOMPurify) from 3.0.6 to 3.1.6.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/cure53/DOMPurify/releases">dompurify's releases</a>.</em></p>
<blockquote>
<h2>DOMPurify 3.1.6</h2>
<ul>
<li>Fixed an issue with the execution logic of attribute hooks to prevent bypasses, thanks <a href="https://github.com/kevin-mizu"><code>@​kevin-mizu</code></a></li>
<li>Fixed an issue with element removal leading to uncaught errors through DOM Clobbering, thanks <a href="https://github.com/realansgar"><code>@​realansgar</code></a></li>
<li>Fixed a minor problem with the bower file pointing to the wrong dist path</li>
<li>Fixed several minor typos in docs, comments and comment blocks, thanks <a href="https://github.com/Rotzbua"><code>@​Rotzbua</code></a></li>
<li>Updated several development dependencies</li>
</ul>
<h2>DOMPurify 3.1.5</h2>
<ul>
<li>Fixed a minor issue with the dist paths in <code>bower.js</code>, thanks <a href="https://github.com/HakumenNC"><code>@​HakumenNC</code></a></li>
<li>Fixed a minor issue with sanitizing HTML coming from copy&amp;paste Word content, thanks <a href="https://github.com/kakao-bishop-cho"><code>@​kakao-bishop-cho</code></a></li>
</ul>
<h2>DOMPurify 3.1.4</h2>
<ul>
<li>Fixed an issue with the recently implemented <code>isNaN</code> checks, thanks <a href="https://github.com/tulach"><code>@​tulach</code></a></li>
<li>Added several new popover attributes to allow-list, thanks <a href="https://github.com/Gigabyte5671"><code>@​Gigabyte5671</code></a></li>
<li>Fixed the tests and adjusted the test runner to cover all branches</li>
</ul>
<h2>DOMPurify 3.1.3</h2>
<ul>
<li>Fixed several mXSS variations found by and thanks to <a href="https://github.com/kevin-mizu"><code>@​kevin-mizu</code></a> &amp; <a href="https://github.com/Ry0taK"><code>@​Ry0taK</code></a></li>
<li>Added better configurability for comment scrubbing default behavior</li>
<li>Added better hardening against Prototype Pollution attacks, thanks <a href="https://github.com/kevin-mizu"><code>@​kevin-mizu</code></a></li>
<li>Added better handling and readability of the <code>nodeType</code> property, thanks <a href="https://github.com/ssi02014"><code>@​ssi02014</code></a></li>
<li>Fixed some smaller issues in README and other documentation</li>
</ul>
<h2>DOMPurify 3.1.2</h2>
<ul>
<li>Addressed and fixed a mXSS variation found by <a href="https://github.com/kevin-mizu"><code>@​kevin-mizu</code></a></li>
<li>Addressed and fixed a mXSS variation found by <a href="https://twitter.com/hash_kitten">Adam Kues</a> of Assetnote</li>
<li>Updated tests for older Safari and Chrome versions</li>
</ul>
<h2>DOMPurify 3.1.1</h2>
<ul>
<li>Fixed an mXSS sanitiser bypass reported by <a href="https://github.com/icesfont"><code>@​icesfont</code></a></li>
<li>Added new code to track element nesting depth</li>
<li>Added new code to enforce a maximum nesting depth of 255</li>
<li>Added coverage tests and necessary clobbering protections</li>
</ul>
<p><strong>Note that this is a security release and should be upgraded to immediately. Please also note that further releases may follow as the underlying vulnerability is apparently new and further variations may be discovered.</strong></p>
<h2>DOMPurify 3.1.0</h2>
<ul>
<li>Added new setting <code>SAFE_FOR_XML</code> to enable better control over comment scrubbing</li>
<li>Updated README to warn about <em>happy-dom</em> not being safe for use with DOMPurify yet</li>
<li>Updated the LICENSE file to show the accurate year number</li>
<li>Updated several build and test dependencies</li>
</ul>
<h2>DOMPurify 3.0.11</h2>
<ul>
<li>Fixed another conditional bypass caused by Processing Instructions, thanks <a href="https://github.com/Ry0taK"><code>@​Ry0taK</code></a></li>
<li>Fixed the regex for HTML Custom Element detection, thanks <a href="https://github.com/AlekseySolovey3T"><code>@​AlekseySolovey3T</code></a></li>
</ul>
<h2>DOMPurify 3.0.10</h2>
<ul>
<li>Fixed two possible bypasses when sanitizing an XML document and later using it in HTML, thanks <a href="https://github.com/Slonser"><code>@​Slonser</code></a></li>
<li>Bumped up some build and test dependencies</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
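
As a quick illustration of the `SAFE_FOR_XML` option introduced in 3.1.0 above, here is a browser-context sketch (in Node you would first create a window via jsdom); the input string and expected output are illustrative:

```ts
// sanitize.ts — DOMPurify 3.1.x sketch (npm i dompurify)
import DOMPurify from "dompurify";

const dirty = `<p>hi</p><img src="x" onerror="alert(1)"><!-- scrub me -->`;

// SAFE_FOR_XML (added in 3.1.0) governs the comment-scrubbing behavior
// described in the release notes; the onerror handler is removed either way.
const clean = DOMPurify.sanitize(dirty, { SAFE_FOR_XML: true });
console.log(clean); // e.g. <p>hi</p><img src="x">
```
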
<details>
<summary>Commits</summary>
<ul>
<li><a href="4083a9096b"><code>4083a90</code></a> Merge pull request <a href="https://redirect.github.com/cure53/DOMPurify/issues/978">#978</a> from cure53/main</li>
<li><a href="90a10a14af"><code>90a10a1</code></a> fix: Fixed a typo on the README</li>
<li><a href="65df0428f0"><code>65df042</code></a> chore: Preparing 3.1.6 release</li>
<li><a href="6e03334bab"><code>6e03334</code></a> fix: Made sure that remove() is not called directly from node</li>
<li><a href="00fc06cf57"><code>00fc06c</code></a> fix: Fixed a DOM clobbering issue leading to an error being thrown</li>
<li><a href="f8c2ef5911"><code>f8c2ef5</code></a> Merge pull request <a href="https://redirect.github.com/cure53/DOMPurify/issues/977">#977</a> from cure53/dependabot/npm_and_yarn/multi-99ca4f73d8</li>
<li><a href="e5112ec40a"><code>e5112ec</code></a> build(deps): bump ws and socket.io-adapter</li>
<li><a href="9978cecea2"><code>9978cec</code></a> docs: Added better security warning about SAFE_FOR_XML to README</li>
<li><a href="fa542df7e8"><code>fa542df</code></a> fix: Changed the order for attribute checks slightly for safer hooks</li>
<li><a href="b8b552cb21"><code>b8b552c</code></a> Merge pull request <a href="https://redirect.github.com/cure53/DOMPurify/issues/975">#975</a> from cure53/dependabot/npm_and_yarn/multi-2d3aef8690</li>
<li>Additional commits viewable in <a href="https://github.com/cure53/DOMPurify/compare/3.0.6...3.1.6">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=dompurify&package-manager=npm_and_yarn&previous-version=3.0.6&new-version=3.1.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

2024-09-16 20:47:23 +00:00
dependabot[bot]
fe261b22c5
⬆️ Bump sass from 1.77.0 to 1.78.0 in /website (#5423)
Bumps [sass](https://github.com/sass/dart-sass) from 1.77.0 to 1.78.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.78.0</h2>
<p>To install Sass 1.78.0, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>
<p>The <code>meta.feature-exists</code> function is now deprecated. This deprecation is named <code>feature-exists</code>.</p>
</li>
<li>
<p>Fix a crash when using <code>@at-root</code> without any queries or children in the indented syntax.</p>
</li>
</ul>
<h3>JS API</h3>
<ul>
<li>
<p>Backport the deprecation options (<code>fatalDeprecations</code>, <code>futureDeprecations</code>, and <code>silenceDeprecations</code>) to the legacy JS API. The legacy JS API is itself deprecated, and you should move off of it if possible, but this will allow users of bundlers and other tools that are still using the legacy API to still control deprecation warnings.</p>
</li>
<li>
<p>Fix a bug where accessing <code>SourceSpan.url</code> would crash when a relative URL was passed to the Sass API.</p>
</li>
</ul>
<h3>Embedded Sass</h3>
<ul>
<li>
<p>Explicitly expose a <code>sass</code> executable from the <code>sass-embedded</code> npm package. This was intended to be included in 1.63.0, but due to the way platform-specific dependency executables are installed it did not work as intended. Now users can run <code>npx sass</code> for local installs or just <code>sass</code> when <code>sass-embedded</code> is installed globally.</p>
</li>
<li>
<p>Add linux-riscv64, linux-musl-riscv64, and android-riscv64 support for the <code>sass-embedded</code> npm package.</p>
</li>
<li>
<p>Fix an edge case where the Dart VM could hang when shutting down when requests were in flight.</p>
</li>
<li>
<p>Fix a race condition where the embedded host could fail to shut down if it was closed around the same time a new compilation was started.</p>
</li>
<li>
<p>Fix a bug where parse-time deprecation warnings could not be controlled by the deprecation options in some circumstances.</p>
</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1780">full changelog</a> for changes in earlier releases.</p>
<h2>Dart Sass 1.77.8</h2>
<p>To install Sass 1.77.8, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>No user-visible changes.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1778">full changelog</a> for changes in earlier releases.</p>
<h2>Dart Sass 1.77.5</h2>
<p>To install Sass 1.77.5, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.78.0</h2>
<ul>
<li>
<p>The <code>meta.feature-exists</code> function is now deprecated. This deprecation is
named <code>feature-exists</code>.</p>
</li>
<li>
<p>Fix a crash when using <code>@at-root</code> without any queries or children in the
indented syntax.</p>
</li>
</ul>
<h3>JS API</h3>
<ul>
<li>
<p>Backport the deprecation options (<code>fatalDeprecations</code>, <code>futureDeprecations</code>,
and <code>silenceDeprecations</code>) to the legacy JS API. The legacy JS API is itself
deprecated, and you should move off of it if possible, but this will allow
users of bundlers and other tools that are still using the legacy API to
still control deprecation warnings.</p>
</li>
<li>
<p>Fix a bug where accessing <code>SourceSpan.url</code> would crash when a relative URL was
passed to the Sass API.</p>
</li>
</ul>
<h3>Embedded Sass</h3>
<ul>
<li>
<p>Explicitly expose a <code>sass</code> executable from the <code>sass-embedded</code> npm package.
This was intended to be included in 1.63.0, but due to the way
platform-specific dependency executables are installed it did not work as
intended. Now users can run <code>npx sass</code> for local installs or just <code>sass</code> when
<code>sass-embedded</code> is installed globally.</p>
</li>
<li>
<p>Add linux-riscv64, linux-musl-riscv64, and android-riscv64 support for the
<code>sass-embedded</code> npm package.</p>
</li>
<li>
<p>Fix an edge case where the Dart VM could hang when shutting down when requests
were in flight.</p>
</li>
<li>
<p>Fix a race condition where the embedded host could fail to shut down if it was
closed around the same time a new compilation was started.</p>
</li>
<li>
<p>Fix a bug where parse-time deprecation warnings could not be controlled by
the deprecation options in some circumstances.</p>
</li>
</ul>
<h2>1.77.8</h2>
<ul>
<li>No user-visible changes.</li>
</ul>
<h2>1.77.7</h2>
<ul>
<li>
<p>Declarations that appear after nested rules are deprecated, because the
semantics Sass has historically used are different from the semantics
specified by CSS. In the future, Sass will adopt the standard CSS semantics.</p>
<p>See <a href="https://sass-lang.com/d/mixed-decls">the Sass website</a> for details.</p>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
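
The deprecation options called out above already exist on the modern JS API; 1.78.0 backports them to the legacy one. A minimal sketch on the modern API (the input path is illustrative):

```ts
// build.ts — deprecation controls, sass >= 1.78
import * as sass from "sass";

const result = sass.compile("src/main.scss", {
  // Quiet the newly deprecated meta.feature-exists named above.
  silenceDeprecations: ["feature-exists"],
  // Escalate another deprecation to a build failure instead.
  fatalDeprecations: ["mixed-decls"],
});
console.log(result.css);
```
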
<details>
<summary>Commits</summary>
<ul>
<li><a href="90a70ef168"><code>90a70ef</code></a> Fix failing double check test for sass-parser (<a href="https://redirect.github.com/sass/dart-sass/issues/2330">#2330</a>)</li>
<li><a href="b1d5f987d9"><code>b1d5f98</code></a> Backport deprecation API to legacy JS API (<a href="https://redirect.github.com/sass/dart-sass/issues/2293">#2293</a>)</li>
<li><a href="56a42371e0"><code>56a4237</code></a> Delete unreachable <code>default</code> clause. (<a href="https://redirect.github.com/sass/dart-sass/issues/2323">#2323</a>)</li>
<li><a href="a7f623dd13"><code>a7f623d</code></a> Bump bufbuild/buf-setup-action in /.github/util/initialize (<a href="https://redirect.github.com/sass/dart-sass/issues/2319">#2319</a>)</li>
<li><a href="9f82850504"><code>9f82850</code></a> Ignore new <code>unreachable_switch_default</code> warning. (<a href="https://redirect.github.com/sass/dart-sass/issues/2318">#2318</a>)</li>
<li><a href="798cd7cf57"><code>798cd7c</code></a> Update pubspec.yaml (<a href="https://redirect.github.com/sass/dart-sass/issues/2321">#2321</a>)</li>
<li><a href="2bf3ae0eed"><code>2bf3ae0</code></a> Fix a comment (<a href="https://redirect.github.com/sass/dart-sass/issues/2316">#2316</a>)</li>
<li><a href="eb6c19e53c"><code>eb6c19e</code></a> Initial implementation of a PostCSS-compatible parser JS API (<a href="https://redirect.github.com/sass/dart-sass/issues/2304">#2304</a>)</li>
<li><a href="c3cccefe2e"><code>c3cccef</code></a> Bump dartdoc from 8.0.7 to 8.0.8 (<a href="https://redirect.github.com/sass/dart-sass/issues/2300">#2300</a>)</li>
<li><a href="f0a01829ce"><code>f0a0182</code></a> docs: Fix link to custom importer (<a href="https://redirect.github.com/sass/dart-sass/issues/2315">#2315</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/sass/dart-sass/compare/1.77.0...1.78.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.77.0&new-version=1.78.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

2024-09-04 05:54:58 +00:00
dependabot[bot]
18e3661289
⬆️ Bump webpack from 5.89.0 to 5.94.0 in /website (#5417)
Bumps [webpack](https://github.com/webpack/webpack) from 5.89.0 to 5.94.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/webpack/webpack/releases">webpack's releases</a>.</em></p>
<blockquote>
<h2>v5.94.0</h2>
<h2>Bug Fixes</h2>
<ul>
<li>Added runtime condition for harmony reexport checked</li>
<li>Handle properly <code>data</code>/<code>http</code>/<code>https</code> protocols in source maps</li>
<li>Make <code>bigint</code> optimistic when browserslist not found</li>
<li>Move <code>@​types/eslint-scope</code> to dev deps</li>
<li>Related in asset stats is now always an array when no related found</li>
<li>Handle ASI for export declarations</li>
<li>Fixed incorrect mangle destructuring with export named default</li>
<li>Fixed unexpected asi generation with sequence expression</li>
<li>Fixed a lot of types</li>
</ul>
<h2>New Features</h2>
<ul>
<li>Added new external type &quot;module-import&quot;</li>
<li>Support <code>webpackIgnore</code> for <code>new URL()</code> construction</li>
<li>[CSS] <code>@import</code> pathinfo support</li>
</ul>
<h2>Security</h2>
<ul>
<li>Fixed DOM clobbering in auto public path</li>
</ul>
<h2>v5.93.0</h2>
<h2>Bug Fixes</h2>
<ul>
<li>Generate correct relative path to runtime chunks</li>
<li>Makes <code>DefinePlugin</code> quieter under default log level</li>
<li>Fixed mangle destructuring default in namespace import</li>
<li>Fixed consumption of eager shared modules for module federation</li>
<li>Strip slash for pretty regexp</li>
<li>Calculate correct contenthash for CSS generator options</li>
</ul>
<h2>New Features</h2>
<ul>
<li>Added the <code>binary</code> generator option for asset modules to explicitly keep source maps produced by loaders</li>
<li>Added the <code>modern-module</code> library value for tree shakable output</li>
<li>Added the <code>overrideStrict</code> option to override strict or non-strict mode for javascript modules</li>
</ul>
<h2>v5.92.1</h2>
<h2>Bug Fixes</h2>
<ul>
<li>Doesn't crash with an error when the css experiment is enabled and contenthash is used</li>
</ul>
<h2>v5.92.0</h2>
<h2>Bug Fixes</h2>
<ul>
<li>Correct tilde range's computation for module federation</li>
<li>Consider runtime for pure expression dependency update hash</li>
<li>Return value in the <code>subtractRuntime</code> function for runtime logic</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
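
The `webpackIgnore` support for `new URL()` mentioned above works like the existing magic comment for dynamic `import()`. A minimal sketch (the asset path is illustrative):

```ts
// asset-url.ts — webpack >= 5.94
// Without the comment, webpack rewrites the URL to a bundled asset;
// with webpackIgnore it leaves the construction alone at runtime.
const bundled = new URL("./logo.png", import.meta.url);
const ignored = new URL(/* webpackIgnore: true */ "./logo.png", import.meta.url);

console.log(bundled.href, ignored.href);
```
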
<details>
<summary>Commits</summary>
<ul>
<li><a href="eabf85d858"><code>eabf85d</code></a> chore(release): 5.94.0</li>
<li><a href="955e057abc"><code>955e057</code></a> security: fix DOM clobbering in auto public path</li>
<li><a href="9822387362"><code>9822387</code></a> test: fix</li>
<li><a href="cbb86ede32"><code>cbb86ed</code></a> test: fix</li>
<li><a href="5ac3d7f2cd"><code>5ac3d7f</code></a> fix: unexpected asi generation with sequence expression</li>
<li><a href="2411661bd1"><code>2411661</code></a> security: fix DOM clobbering in auto public path</li>
<li><a href="b8c03d4772"><code>b8c03d4</code></a> fix: unexpected asi generation with sequence expression</li>
<li><a href="f46a03ccbc"><code>f46a03c</code></a> revert: do not use heuristic fallback for &quot;module-import&quot;</li>
<li><a href="60f189871a"><code>60f1898</code></a> fix: do not use heuristic fallback for &quot;module-import&quot;</li>
<li><a href="66306aa456"><code>66306aa</code></a> Revert &quot;fix: module-import get fallback from externalsPresets&quot;</li>
<li>Additional commits viewable in <a href="https://github.com/webpack/webpack/compare/v5.89.0...v5.94.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=webpack&package-manager=npm_and_yarn&previous-version=5.89.0&new-version=5.94.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

2024-08-30 22:06:35 +00:00
dependabot[bot]
d87e24d839
⬆️ Bump @docusaurus/plugin-google-gtag from 3.4.0 to 3.5.1 in /website (#5402)
Bumps [@docusaurus/plugin-google-gtag](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag) from 3.4.0 to 3.5.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/plugin-google-gtag</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.5.1 (2024-08-09)</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10384">#10384</a> fix(core): algolia context import (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10382">#10382</a> fix(theme-algolia): useDocusaurusContext import error (<a href="https://github.com/anaclumos"><code>@​anaclumos</code></a>)</li>
</ul>
</li>
</ul>
<h4>Committers: 2</h4>
<ul>
<li>Sunghyun Cho (<a href="https://github.com/anaclumos"><code>@​anaclumos</code></a>)</li>
<li>Sébastien Lorber (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
<h2>3.5.0 (2024-08-09)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10375">#10375</a> feat(blog): add <code>onUntruncatedBlogPosts</code> blog options (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10376">#10376</a> feat(theme): show unlisted/draft banners in dev mode (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9252">#9252</a> feat(blog): add feed xlst options to render beautiful RSS and Atom feeds (<a href="https://github.com/Xebec19"><code>@​Xebec19</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-theme-translations</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10216">#10216</a> feat(blog): authors page (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10339">#10339</a> feat(translation): add Estonian default translation (<a href="https://github.com/chirbard"><code>@​chirbard</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10325">#10325</a> feat(translations): Indonesian translation (<a href="https://github.com/priyadi"><code>@​priyadi</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10335">#10335</a> feat(mdx-loader): wrap mdx content title (<code># Title</code>) in <code>&lt;header&gt;</code> for concistency (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10222">#10222</a> feat(blog): author header social icons (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-client-redirects</code>, <code>docusaurus-plugin-google-analytics</code>, <code>docusaurus-plugin-google-gtag</code>, <code>docusaurus-plugin-google-tag-manager</code>, <code>docusaurus-plugin-pwa</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-plugin-vercel-analytics</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10286">#10286</a> feat(core): allow plugins to self-disable by returning null (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10252">#10252</a> feat(blog): group sidebar items by year (<code>themeConfig.blog.sidebar.groupByYear</code>) (<a href="https://github.com/alicelovescake"><code>@​alicelovescake</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10224">#10224</a> feat(blog): warn duplicate and inline authors (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-utils-validation</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10241">#10241</a> feat(mdx): support recma plugins (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10344">#10344</a> fix(translations): fix wrong Estonian (et) translations and typos (<a href="https://github.com/Gekd"><code>@​Gekd</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10360">#10360</a> fix(translations): Fix and Improve Spanish translations (<a href="https://github.com/sergioalmela"><code>@​sergioalmela</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10235">#10235</a> fix(theme-translation): add missing German (de) theme.admonition translations (<a href="https://github.com/franzd1"><code>@​franzd1</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10342">#10342</a> fix(search): fix algolia search ignore ctrl + F in search input (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
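
Two of the 3.5 features above translate directly into config. A minimal `docusaurus.config.ts` sketch, with the site values and the env-var gate purely illustrative:

```ts
import type { Config } from "@docusaurus/types";

const config: Config = {
  title: "My Site",
  url: "https://example.com",
  baseUrl: "/",
  presets: [["classic", { gtag: { trackingID: "G-XXXXXXX" } }]],
  themeConfig: {
    // #10252: group blog sidebar items by year.
    blog: { sidebar: { groupByYear: true } },
  },
  plugins: [
    // #10286: a plugin may now self-disable by returning null.
    () => (process.env.EXTRA_PLUGIN ? { name: "extra-plugin" } : null),
  ],
};

export default config;
```
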
<details>
<summary>Commits</summary>
<ul>
<li><a href="5acbc57bd6"><code>5acbc57</code></a> v3.5.1</li>
<li><a href="daa6b87f24"><code>daa6b87</code></a> chore: release Docusaurus v3.5 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10379">#10379</a>)</li>
<li><a href="afa9fcc965"><code>afa9fcc</code></a> docs(plugin-google-gtag): replace the broken Google Developers links with val...</li>
<li><a href="80203b385d"><code>80203b3</code></a> feat(core): allow plugins to self-disable by returning null (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10286">#10286</a>)</li>
<li><a href="6dd9a5076e"><code>6dd9a50</code></a> chore: simplify TypeScript configs, use TS 5.5 configDir placeholder (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10256">#10256</a>)</li>
<li><a href="dbdd4dfb2e"><code>dbdd4df</code></a> chore: release Docusaurus v3.4 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10186">#10186</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.5.1/packages/docusaurus-plugin-google-gtag">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/plugin-google-gtag&package-manager=npm_and_yarn&previous-version=3.4.0&new-version=3.5.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

2024-08-12 06:00:52 +00:00
dependabot[bot]
b3775e2feb
⬆️ Bump @docusaurus/module-type-aliases from 3.4.0 to 3.5.1 in /website (#5400)
Bumps [@docusaurus/module-type-aliases](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases) from 3.4.0 to 3.5.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/module-type-aliases</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.5.1 (2024-08-09)</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10384">#10384</a> fix(core): algolia context import (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10382">#10382</a> fix(theme-algolia): useDocusaurusContext import error (<a href="https://github.com/anaclumos"><code>@​anaclumos</code></a>)</li>
</ul>
</li>
</ul>
<h4>Committers: 2</h4>
<ul>
<li>Sunghyun Cho (<a href="https://github.com/anaclumos"><code>@​anaclumos</code></a>)</li>
<li>Sébastien Lorber (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
<h2>3.5.0 (2024-08-09)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10375">#10375</a> feat(blog): add <code>onUntruncatedBlogPosts</code> blog options (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10376">#10376</a> feat(theme): show unlisted/draft banners in dev mode (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9252">#9252</a> feat(blog): add feed xlst options to render beautiful RSS and Atom feeds (<a href="https://github.com/Xebec19"><code>@​Xebec19</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-theme-translations</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10216">#10216</a> feat(blog): authors page (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10339">#10339</a> feat(translation): add Estonian default translation (<a href="https://github.com/chirbard"><code>@​chirbard</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10325">#10325</a> feat(translations): Indonesian translation (<a href="https://github.com/priyadi"><code>@​priyadi</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10335">#10335</a> feat(mdx-loader): wrap mdx content title (<code># Title</code>) in <code>&lt;header&gt;</code> for concistency (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10222">#10222</a> feat(blog): author header social icons (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-client-redirects</code>, <code>docusaurus-plugin-google-analytics</code>, <code>docusaurus-plugin-google-gtag</code>, <code>docusaurus-plugin-google-tag-manager</code>, <code>docusaurus-plugin-pwa</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-plugin-vercel-analytics</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10286">#10286</a> feat(core): allow plugins to self-disable by returning null (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10252">#10252</a> feat(blog): group sidebar items by year (<code>themeConfig.blog.sidebar.groupByYear</code>) (<a href="https://github.com/alicelovescake"><code>@​alicelovescake</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10224">#10224</a> feat(blog): warn duplicate and inline authors (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-utils-validation</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10241">#10241</a> feat(mdx): support recma plugins (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10344">#10344</a> fix(translations): fix wrong Estonian (et) translations and typos (<a href="https://github.com/Gekd"><code>@​Gekd</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10360">#10360</a> fix(translations): Fix and Improve Spanish translations (<a href="https://github.com/sergioalmela"><code>@​sergioalmela</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10235">#10235</a> fix(theme-translation): add missing German (de) theme.admonition translations (<a href="https://github.com/franzd1"><code>@​franzd1</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10342">#10342</a> fix(search): fix algolia search ignore ctrl + F in search input (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md"><code>@​docusaurus/module-type-aliases</code>'s changelog</a>.</em></p>
<blockquote>
<h2>3.5.1 (2024-08-09)</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10384">#10384</a> fix(core): algolia context import (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10382">#10382</a> fix(theme-algolia): useDocusaurusContext import error (<a href="https://github.com/anaclumos"><code>@​anaclumos</code></a>)</li>
</ul>
</li>
</ul>
<h4>Committers: 2</h4>
<ul>
<li>Sunghyun Cho (<a href="https://github.com/anaclumos"><code>@​anaclumos</code></a>)</li>
<li>Sébastien Lorber (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
<h2>3.5.0 (2024-08-09)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10375">#10375</a> feat(blog): add <code>onUntruncatedBlogPosts</code> blog options (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10376">#10376</a> feat(theme): show unlisted/draft banners in dev mode (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9252">#9252</a> feat(blog): add feed xlst options to render beautiful RSS and Atom feeds (<a href="https://github.com/Xebec19"><code>@​Xebec19</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-theme-translations</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10216">#10216</a> feat(blog): authors page (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10339">#10339</a> feat(translation): add Estonian default translation (<a href="https://github.com/chirbard"><code>@​chirbard</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10325">#10325</a> feat(translations): Indonesian translation (<a href="https://github.com/priyadi"><code>@​priyadi</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10335">#10335</a> feat(mdx-loader): wrap mdx content title (<code># Title</code>) in <code>&lt;header&gt;</code> for concistency (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10222">#10222</a> feat(blog): author header social icons (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-client-redirects</code>, <code>docusaurus-plugin-google-analytics</code>, <code>docusaurus-plugin-google-gtag</code>, <code>docusaurus-plugin-google-tag-manager</code>, <code>docusaurus-plugin-pwa</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-plugin-vercel-analytics</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10286">#10286</a> feat(core): allow plugins to self-disable by returning null (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10252">#10252</a> feat(blog): group sidebar items by year (<code>themeConfig.blog.sidebar.groupByYear</code>) (<a href="https://github.com/alicelovescake"><code>@​alicelovescake</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10224">#10224</a> feat(blog): warn duplicate and inline authors (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-utils-validation</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10241">#10241</a> feat(mdx): support recma plugins (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10344">#10344</a> fix(translations): fix wrong Estonian (et) translations and typos (<a href="https://github.com/Gekd"><code>@​Gekd</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10360">#10360</a> fix(translations): Fix and Improve Spanish translations (<a href="https://github.com/sergioalmela"><code>@​sergioalmela</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10235">#10235</a> fix(theme-translation): add missing German (de) theme.admonition translations (<a href="https://github.com/franzd1"><code>@​franzd1</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10342">#10342</a> fix(search): fix algolia search ignore ctrl + F in search input (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
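The #10375 and #9252 entries above are both blog-plugin options. A minimal sketch of how they might be configured, assuming Docusaurus >= 3.5; the severity value and feed types shown are illustrative:

```js
// Blog options passed to the classic preset (sketch, assumes Docusaurus >= 3.5)
module.exports = {
  presets: [
    [
      'classic',
      {
        blog: {
          // #10375: what to do when a post has no <!-- truncate --> marker
          onUntruncatedBlogPosts: 'warn',
          feedOptions: {
            type: ['rss', 'atom'],
            // #9252: attach XSLT stylesheets so feeds render nicely in browsers
            xslt: true,
          },
        },
      },
    ],
  ],
};
```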
<details>
<summary>Commits</summary>
<ul>
<li><a href="5acbc57bd6"><code>5acbc57</code></a> v3.5.1</li>
<li><a href="daa6b87f24"><code>daa6b87</code></a> chore: release Docusaurus v3.5 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10379">#10379</a>)</li>
<li><a href="dbdd4dfb2e"><code>dbdd4df</code></a> chore: release Docusaurus v3.4 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10186">#10186</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.5.1/packages/docusaurus-module-type-aliases">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/module-type-aliases&package-manager=npm_and_yarn&previous-version=3.4.0&new-version=3.5.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-08-12 05:51:21 +00:00
dependabot[bot]
4fc5b5b146
⬆️ Bump @docusaurus/plugin-google-gtag from 3.3.2 to 3.4.0 in /website (#5347)
Bumps [@docusaurus/plugin-google-gtag](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag) from 3.3.2 to 3.4.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/plugin-google-gtag</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.4.0 (2024-05-31)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-utils-validation</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10137">#10137</a> feat(docs, blog): add support for <code>tags.yml</code>, predefined list of tags (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10151">#10151</a> feat(theme-translations): Added Turkmen (tk) default theme translations (<a href="https://github.com/ilmedova"><code>@​ilmedova</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10111">#10111</a> feat(theme-translations): Add Bulgarian default theme translations (bg) (<a href="https://github.com/PetarMc1"><code>@​PetarMc1</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-client-redirects</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-pwa</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-theme-search-algolia</code>, <code>docusaurus-types</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9859">#9859</a> feat(core): hash router option - browse site offline (experimental) (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10121">#10121</a> feat(core): site storage config options (experimental) (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10185">#10185</a> fix(docs, blog): Markdown link resolution does not support hot reload (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10178">#10178</a> fix(theme): SearchPage should respect <code>contextualSearch: false</code> setting (<a href="https://github.com/ncoughlin"><code>@​ncoughlin</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10164">#10164</a> fix(search): fix algolia search container bug (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10168">#10168</a> fix(mdx-loader): resolve Markdown/MDX links with Remark instead of RegExp (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10165">#10165</a> fix(theme-translation): add missing Korean (ko) theme translations (<a href="https://github.com/revi"><code>@​revi</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10157">#10157</a> fix(theme-translations): complete Vietnamese theme translations (<a href="https://github.com/namnguyenthanhwork"><code>@​namnguyenthanhwork</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10145">#10145</a> fix(core): fix serve workaround regexp (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10142">#10142</a> fix(core): fix <code>docusaurus serve</code> broken for assets when using trailingSlash (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10130">#10130</a> fix(core): the broken anchor checker should not be sensitive pathname trailing slashes (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10144">#10144</a> fix(theme): fix announcement bar layout shift due to missing storage key namespace (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10132">#10132</a> fix(core): <code>configurePostCss()</code> should run after <code>configureWebpack()</code> (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10131">#10131</a> fix(core): codegen should generate unique route prop filenames (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10118">#10118</a> fix(theme-translations): fix missing pluralization for label DocCard.categoryDescription.plurals (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>📝 Documentation</h4>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10176">#10176</a> docs: add community plugin docusaurus-graph (<a href="https://github.com/Arsero"><code>@​Arsero</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10173">#10173</a> docs: improve how to use <code>&lt;details&gt;</code> (<a href="https://github.com/tats-u"><code>@​tats-u</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10167">#10167</a> docs: suggest using <code>{&lt;...&gt;...&lt;/...&gt;}</code> if don't use Markdown in migra… (<a href="https://github.com/tats-u"><code>@​tats-u</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10143">#10143</a> docs: recommend users to remove hast-util-is-element in migration to v3 (<a href="https://github.com/tats-u"><code>@​tats-u</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10124">#10124</a> docs: v3 prepare your site blog post should point users to the upgrade guide (<a href="https://github.com/homotechsual"><code>@​homotechsual</code></a>)</li>
</ul>
<h4>🤖 Dependencies</h4>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10155">#10155</a> chore(deps): bump peaceiris/actions-gh-pages from 3 to 4 (<a href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>)</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
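Both experimental 3.4 features above (#9859 hash router, #10121 site storage) are enabled under the `future` key of the site config. A minimal sketch, assuming Docusaurus >= 3.4; the specific values are illustrative:

```js
// docusaurus.config.js (sketch, assumes Docusaurus >= 3.4)
module.exports = {
  future: {
    // #9859: hash router so the built site can be browsed offline,
    // straight from the filesystem (experimental)
    experimental_router: 'hash',
    // #10121: configure and namespace the browser storage keys the theme
    // uses, avoiding collisions between sites on one domain (experimental)
    experimental_storage: {
      type: 'localStorage',
      namespace: true,
    },
  },
};
```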
<details>
<summary>Commits</summary>
<ul>
<li><a href="49e9a21432"><code>49e9a21</code></a> v3.4.0</li>
<li><a href="c125f7a272"><code>c125f7a</code></a> chore: release Docusaurus 3.3.0 + 3.3.1 + 3.3.2 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10101">#10101</a>)</li>
<li><a href="6b53d4263d"><code>6b53d42</code></a> misc: make copyUntypedFiles work for watch mode (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7445">#7445</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.4.0/packages/docusaurus-plugin-google-gtag">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/plugin-google-gtag&package-manager=npm_and_yarn&previous-version=3.3.2&new-version=3.4.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

2024-06-03 05:46:49 +00:00
dependabot[bot]
0fe2588e78
⬆️ Bump @docusaurus/module-type-aliases from 3.3.2 to 3.4.0 in /website (#5346)
Bumps [@docusaurus/module-type-aliases](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases) from 3.3.2 to 3.4.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/module-type-aliases</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.4.0 (2024-05-31)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>create-docusaurus</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-utils-validation</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10137">#10137</a> feat(docs, blog): add support for <code>tags.yml</code>, predefined list of tags (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10151">#10151</a> feat(theme-translations): Added Turkmen (tk) default theme translations (<a href="https://github.com/ilmedova"><code>@​ilmedova</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10111">#10111</a> feat(theme-translations): Add Bulgarian default theme translations (bg) (<a href="https://github.com/PetarMc1"><code>@​PetarMc1</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-client-redirects</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-pwa</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-theme-search-algolia</code>, <code>docusaurus-types</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9859">#9859</a> feat(core): hash router option - browse site offline (experimental) (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10121">#10121</a> feat(core): site storage config options (experimental) (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10185">#10185</a> fix(docs, blog): Markdown link resolution does not support hot reload (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10178">#10178</a> fix(theme): SearchPage should respect <code>contextualSearch: false</code> setting (<a href="https://github.com/ncoughlin"><code>@​ncoughlin</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10164">#10164</a> fix(search): fix algolia search container bug (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10168">#10168</a> fix(mdx-loader): resolve Markdown/MDX links with Remark instead of RegExp (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10165">#10165</a> fix(theme-translation): add missing Korean (ko) theme translations (<a href="https://github.com/revi"><code>@​revi</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10157">#10157</a> fix(theme-translations): complete Vietnamese theme translations (<a href="https://github.com/namnguyenthanhwork"><code>@​namnguyenthanhwork</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10145">#10145</a> fix(core): fix serve workaround regexp (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10142">#10142</a> fix(core): fix <code>docusaurus serve</code> broken for assets when using trailingSlash (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10130">#10130</a> fix(core): the broken anchor checker should not be sensitive pathname trailing slashes (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10144">#10144</a> fix(theme): fix announcement bar layout shift due to missing storage key namespace (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10132">#10132</a> fix(core): <code>configurePostCss()</code> should run after <code>configureWebpack()</code> (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10131">#10131</a> fix(core): codegen should generate unique route prop filenames (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10118">#10118</a> fix(theme-translations): fix missing pluralization for label DocCard.categoryDescription.plurals (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>📝 Documentation</h4>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10176">#10176</a> docs: add community plugin docusaurus-graph (<a href="https://github.com/Arsero"><code>@​Arsero</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10173">#10173</a> docs: improve how to use <code>&lt;details&gt;</code> (<a href="https://github.com/tats-u"><code>@​tats-u</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10167">#10167</a> docs: suggest using <code>{&lt;...&gt;...&lt;/...&gt;}</code> if don't use Markdown in migra… (<a href="https://github.com/tats-u"><code>@​tats-u</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10143">#10143</a> docs: recommend users to remove hast-util-is-element in migration to v3 (<a href="https://github.com/tats-u"><code>@​tats-u</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10124">#10124</a> docs: v3 prepare your site blog post should point users to the upgrade guide (<a href="https://github.com/homotechsual"><code>@​homotechsual</code></a>)</li>
</ul>
<h4>🤖 Dependencies</h4>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10155">#10155</a> chore(deps): bump peaceiris/actions-gh-pages from 3 to 4 (<a href="https://github.com/apps/dependabot"><code>@​dependabot[bot]</code></a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="49e9a21432"><code>49e9a21</code></a> v3.4.0</li>
<li><a href="620e46350a"><code>620e463</code></a> feat(core): site storage config options (experimental) (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10121">#10121</a>)</li>
<li><a href="c125f7a272"><code>c125f7a</code></a> chore: release Docusaurus 3.3.0 + 3.3.1 + 3.3.2 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10101">#10101</a>)</li>
<li><a href="53564f33ab"><code>53564f3</code></a> refactor(core): prefetch/preload refactor (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/7282">#7282</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.4.0/packages/docusaurus-module-type-aliases">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/module-type-aliases&package-manager=npm_and_yarn&previous-version=3.3.2&new-version=3.4.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

2024-06-03 05:41:31 +00:00
Orhan Tozan
d9d993d267
docs: improve definition of a Backup (#5316)
After building against the API for a while, I realized that a backup is a
snapshot of a single resource, not of the whole M365 service. I had initially
assumed that one backup covers the entire service per tenant
(https://discord.com/channels/1022200980487557130/1022200981376745474/1231385151376719892).
This doc update should help clear up that confusion.

There may be a better way to word it, so any other suggestions are welcome.

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [x] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E

---------

Co-authored-by: ashmrtn <3891298+ashmrtn@users.noreply.github.com>
2024-05-22 08:57:05 -07:00
dependabot[bot]
1b842a1c60
⬆️ Bump sass from 1.76.0 to 1.77.0 in /website (#5324)
Bumps [sass](https://github.com/sass/dart-sass) from 1.76.0 to 1.77.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.77.0</h2>
<p>To install Sass 1.77.0, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li><em>Don't</em> throw errors for at-rules in keyframe blocks.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1770">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="85f39d5ad7"><code>85f39d5</code></a> Allow at-rules in <code>@keyframes</code> blocks (<a href="https://redirect.github.com/sass/dart-sass/issues/2236">#2236</a>)</li>
<li>See full diff in <a href="https://github.com/sass/dart-sass/compare/1.76.0...1.77.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.76.0&new-version=1.77.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

2024-05-07 05:36:17 +00:00
dependabot[bot]
64de1d9e17
⬆️ Bump @docusaurus/plugin-google-gtag from 3.2.0 to 3.3.2 in /website (#5320)
Bumps [@docusaurus/plugin-google-gtag](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag) from 3.2.0 to 3.3.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/plugin-google-gtag</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.3.0 (2024-05-03)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-sitemap</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10083">#10083</a> feat: add createSitemapItems hook (<a href="https://github.com/johnnyreilly"><code>@​johnnyreilly</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10064">#10064</a> feat(core): add new site config option <code>siteConfig.markdown.anchors.maintainCase</code> (<a href="https://github.com/iAdramelk"><code>@​iAdramelk</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9767">#9767</a> feat(cli): docusaurus deploy should support a --target-dir option (<a href="https://github.com/SandPod"><code>@​SandPod</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10042">#10042</a> feat(core): simplify plugin API, support route.props (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-pages</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10032">#10032</a> feat(pages): add LastUpdateAuthor &amp; LastUpdateTime &amp; editUrl (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-cssnano-preset</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10092">#10092</a> chore: Upgrade svgr / svgo / cssnano (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10091">#10091</a> fix(theme): <code>&lt;Tabs&gt;</code> props should allow overriding defaults (<a href="https://github.com/gagdiez"><code>@​gagdiez</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10080">#10080</a> fix(theme): <code>&lt;Admonition&gt;</code> should render properly without heading/icon (<a href="https://github.com/andrmaz"><code>@​andrmaz</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10090">#10090</a> fix(core): <code>docusaurus serve</code> redirects should include the site <code>/baseUrl/</code> prefix (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus-preset-classic</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-live-codeblock</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10079">#10079</a> fix: handle React v18.3 warnings (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10070">#10070</a> fix(theme-translations): add missing theme translations for pt-BR (<a href="https://github.com/h3nr1ke"><code>@​h3nr1ke</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10051">#10051</a> fix(theme-translations): correct label for tip admonition in italian (<a href="https://github.com/tomsotte"><code>@​tomsotte</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10048">#10048</a> fix(algolia): add insights property on Algolia Theme Config object TS definition (<a href="https://github.com/Virgil993"><code>@​Virgil993</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10054">#10054</a> fix(core): sortRoutes shouldn't have a default baseUrl value, this led to a bug (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10025">#10025</a> fix(docs): sidebar item label impact the pagination label of docs (<a href="https://github.com/Abdullah-03"><code>@​Abdullah-03</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10022">#10022</a> fix(utils): getFileCommitDate should support <code>log.showSignature=true</code> (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🏃‍♀️ Performance</h4>
<ul>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10060">#10060</a> refactor(core): optimize App entrypoint, it should not re-render when navigating (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>💅 Polish</h4>
<ul>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10061">#10061</a> refactor(theme): simplify CSS solution to solve empty search container (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10023">#10023</a> refactor(website): refactor showcase components (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
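Two of the 3.3 additions above (#10083, #10064) are likewise plain config options. A minimal sketch, assuming Docusaurus >= 3.3; the tag-page filter is a hypothetical example of post-processing sitemap entries:

```js
// docusaurus.config.js (sketch, assumes Docusaurus >= 3.3)
module.exports = {
  markdown: {
    // #10064: keep the original heading case in generated anchor ids
    anchors: {maintainCase: true},
  },
  plugins: [
    [
      '@docusaurus/plugin-sitemap',
      {
        // #10083: hook to post-process the generated sitemap items
        createSitemapItems: async ({defaultCreateSitemapItems, ...params}) => {
          const items = await defaultCreateSitemapItems(params);
          // hypothetical: drop tag listing pages from the sitemap
          return items.filter((item) => !item.url.includes('/tags/'));
        },
      },
    ],
  ],
};
```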
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md"><code>@​docusaurus/plugin-google-gtag</code>'s changelog</a>.</em></p>
<blockquote>
<h2>3.3.2 (2024-05-03)</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10103">#10103</a> fix(core): do not recreate ReactDOM Root, fix React warning on hot reload (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>Committers: 1</h4>
<ul>
<li>Sébastien Lorber (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
<h2>3.3.1 (2024-05-03)</h2>
<p>Failed release</p>
<h2>3.3.0 (2024-05-03)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-sitemap</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10083">#10083</a> feat: add createSitemapItems hook (<a href="https://github.com/johnnyreilly"><code>@​johnnyreilly</code></a>) (see the config sketch after this changelog)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10064">#10064</a> feat(core): add new site config option <code>siteConfig.markdown.anchors.maintainCase</code> (<a href="https://github.com/iAdramelk"><code>@​iAdramelk</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9767">#9767</a> feat(cli): docusaurus deploy should support a --target-dir option (<a href="https://github.com/SandPod"><code>@​SandPod</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10042">#10042</a> feat(core): simplify plugin API, support route.props (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-pages</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10032">#10032</a> feat(pages): add LastUpdateAuthor &amp; LastUpdateTime &amp; editUrl (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-cssnano-preset</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10092">#10092</a> chore: Upgrade svgr / svgo / cssnano (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10091">#10091</a> fix(theme): <code>&lt;Tabs&gt;</code> props should allow overriding defaults (<a href="https://github.com/gagdiez"><code>@​gagdiez</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10080">#10080</a> fix(theme): <code>&lt;Admonition&gt;</code> should render properly without heading/icon (<a href="https://github.com/andrmaz"><code>@​andrmaz</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10090">#10090</a> fix(core): <code>docusaurus serve</code> redirects should include the site <code>/baseUrl/</code> prefix (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus-preset-classic</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-live-codeblock</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10079">#10079</a> fix: handle React v18.3 warnings (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10070">#10070</a> fix(theme-translations): add missing theme translations for pt-BR (<a href="https://github.com/h3nr1ke"><code>@​h3nr1ke</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10051">#10051</a> fix(theme-translations): correct label for tip admonition in italian (<a href="https://github.com/tomsotte"><code>@​tomsotte</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10048">#10048</a> fix(algolia): add insights property on Algolia Theme Config object TS definition (<a href="https://github.com/Virgil993"><code>@​Virgil993</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10054">#10054</a> fix(core): sortRoutes shouldn't have a default baseUrl value, this led to a bug (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10025">#10025</a> fix(docs): sidebar item label impact the pagination label of docs (<a href="https://github.com/Abdullah-03"><code>@​Abdullah-03</code></a>)</li>
</ul>
</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
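
Two of the 3.3.0 additions quoted above are site-config options that are easy to picture in context. Below is a minimal `docusaurus.config.ts` sketch, assuming the documented shapes of the `createSitemapItems` hook (#10083) and `markdown.anchors.maintainCase` (#10064); the site fields and the `/archive/` filter are made up for illustration:

```ts
// docusaurus.config.ts (sketch; not from this repository)
import type {Config} from '@docusaurus/types';

const config: Config = {
  title: 'Example',
  url: 'https://example.com',
  baseUrl: '/',
  markdown: {
    // 3.3.0 (#10064): keep heading casing in generated anchor ids.
    anchors: {maintainCase: true},
  },
  presets: [
    [
      'classic',
      {
        sitemap: {
          // 3.3.0 (#10083): post-process sitemap entries; dropping a
          // hypothetical /archive/ section is just an illustration.
          createSitemapItems: async (params: any) => {
            const {defaultCreateSitemapItems, ...rest} = params;
            const items = await defaultCreateSitemapItems(rest);
            return items.filter((item: any) => !item.url.includes('/archive/'));
          },
        },
      },
    ],
  ],
};

export default config;
```
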
<details>
<summary>Commits</summary>
<ul>
<li><a href="bc638d674b"><code>bc638d6</code></a> v3.3.2</li>
<li><a href="f3524cf332"><code>f3524cf</code></a> v3.3.1</li>
<li><a href="2ec4e078b5"><code>2ec4e07</code></a> v3.3.0</li>
<li><a href="f88da6c66d"><code>f88da6c</code></a> refactor: extract base TS client config + upgrade TS + refactor TS setup (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10">#10</a>...</li>
<li><a href="e012e03158"><code>e012e03</code></a> chore: release Docusaurus 3.2.1 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10016">#10016</a>)</li>
<li><a href="debfc87d34"><code>debfc87</code></a> chore: release Docusaurus v3.2.0 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/10000">#10000</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.3.2/packages/docusaurus-plugin-google-gtag">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/plugin-google-gtag&package-manager=npm_and_yarn&previous-version=3.2.0&new-version=3.3.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-05-06 06:17:05 +00:00
dependabot[bot]
48c0ab5175
⬆️ Bump @docusaurus/module-type-aliases from 3.2.0 to 3.3.2 in /website (#5317)
Bumps [@docusaurus/module-type-aliases](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases) from 3.2.0 to 3.3.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/module-type-aliases</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.3.0 (2024-05-03)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-sitemap</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10083">#10083</a> feat: add createSitemapItems hook (<a href="https://github.com/johnnyreilly"><code>@​johnnyreilly</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10064">#10064</a> feat(core): add new site config option <code>siteConfig.markdown.anchors.maintainCase</code> (<a href="https://github.com/iAdramelk"><code>@​iAdramelk</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9767">#9767</a> feat(cli): docusaurus deploy should support a --target-dir option (<a href="https://github.com/SandPod"><code>@​SandPod</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10042">#10042</a> feat(core): simplify plugin API, support route.props (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-pages</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10032">#10032</a> feat(pages): add LastUpdateAuthor &amp; LastUpdateTime &amp; editUrl (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-cssnano-preset</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10092">#10092</a> chore: Upgrade svgr / svgo / cssnano (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10091">#10091</a> fix(theme): <code>&lt;Tabs&gt;</code> props should allow overriding defaults (<a href="https://github.com/gagdiez"><code>@​gagdiez</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10080">#10080</a> fix(theme): <code>&lt;Admonition&gt;</code> should render properly without heading/icon (<a href="https://github.com/andrmaz"><code>@​andrmaz</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10090">#10090</a> fix(core): <code>docusaurus serve</code> redirects should include the site <code>/baseUrl/</code> prefix (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus-preset-classic</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-live-codeblock</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10079">#10079</a> fix: handle React v18.3 warnings (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10070">#10070</a> fix(theme-translations): add missing theme translations for pt-BR (<a href="https://github.com/h3nr1ke"><code>@​h3nr1ke</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10051">#10051</a> fix(theme-translations): correct label for tip admonition in italian (<a href="https://github.com/tomsotte"><code>@​tomsotte</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10048">#10048</a> fix(algolia): add insights property on Algolia Theme Config object TS definition (<a href="https://github.com/Virgil993"><code>@​Virgil993</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10054">#10054</a> fix(core): sortRoutes shouldn't have a default baseUrl value, this led to a bug (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10025">#10025</a> fix(docs): sidebar item label impact the pagination label of docs (<a href="https://github.com/Abdullah-03"><code>@​Abdullah-03</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10022">#10022</a> fix(utils): getFileCommitDate should support <code>log.showSignature=true</code> (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>🏃‍♀️ Performance</h4>
<ul>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10060">#10060</a> refactor(core): optimize App entrypoint, it should not re-render when navigating (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>💅 Polish</h4>
<ul>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10061">#10061</a> refactor(theme): simplify CSS solution to solve empty search container (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10023">#10023</a> refactor(website): refactor showcase components (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md"><code>@​docusaurus/module-type-aliases</code>'s changelog</a>.</em></p>
<blockquote>
<h2>3.3.2 (2024-05-03)</h2>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10103">#10103</a> fix(core): do not recreate ReactDOM Root, fix React warning on hot reload (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
</ul>
<h4>Committers: 1</h4>
<ul>
<li>Sébastien Lorber (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
<h2>3.3.1 (2024-05-03)</h2>
<p>Failed release</p>
<h2>3.3.0 (2024-05-03)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-sitemap</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10083">#10083</a> feat: add createSitemapItems hook (<a href="https://github.com/johnnyreilly"><code>@​johnnyreilly</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10064">#10064</a> feat(core): add new site config option <code>siteConfig.markdown.anchors.maintainCase</code> (<a href="https://github.com/iAdramelk"><code>@​iAdramelk</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9767">#9767</a> feat(cli): docusaurus deploy should support a --target-dir option (<a href="https://github.com/SandPod"><code>@​SandPod</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10042">#10042</a> feat(core): simplify plugin API, support route.props (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-pages</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10032">#10032</a> feat(pages): add LastUpdateAuthor &amp; LastUpdateTime &amp; editUrl (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-cssnano-preset</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10092">#10092</a> chore: Upgrade svgr / svgo / cssnano (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10091">#10091</a> fix(theme): <code>&lt;Tabs&gt;</code> props should allow overriding defaults (<a href="https://github.com/gagdiez"><code>@​gagdiez</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10080">#10080</a> fix(theme): <code>&lt;Admonition&gt;</code> should render properly without heading/icon (<a href="https://github.com/andrmaz"><code>@​andrmaz</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10090">#10090</a> fix(core): <code>docusaurus serve</code> redirects should include the site <code>/baseUrl/</code> prefix (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-module-type-aliases</code>, <code>docusaurus-preset-classic</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-live-codeblock</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10079">#10079</a> fix: handle React v18.3 warnings (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10070">#10070</a> fix(theme-translations): add missing theme translations for pt-BR (<a href="https://github.com/h3nr1ke"><code>@​h3nr1ke</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10051">#10051</a> fix(theme-translations): correct label for tip admonition in italian (<a href="https://github.com/tomsotte"><code>@​tomsotte</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10048">#10048</a> fix(algolia): add insights property on Algolia Theme Config object TS definition (<a href="https://github.com/Virgil993"><code>@​Virgil993</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10054">#10054</a> fix(core): sortRoutes shouldn't have a default baseUrl value, this led to a bug (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-docs</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/10025">#10025</a> fix(docs): sidebar item label impact the pagination label of docs (<a href="https://github.com/Abdullah-03"><code>@​Abdullah-03</code></a>)</li>
</ul>
</li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="bc638d674b"><code>bc638d6</code></a> v3.3.2</li>
<li><a href="f3524cf332"><code>f3524cf</code></a> v3.3.1</li>
<li><a href="3490433f94"><code>3490433</code></a> Merge branch 'main' into slorber/docusaurus-v3.3</li>
<li><a href="2d8281fc03"><code>2d8281f</code></a> fix(core): do not recreate ReactDOM Root, fix React warning on hot reload (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/1">#1</a>...</li>
<li><a href="2ec4e078b5"><code>2ec4e07</code></a> v3.3.0</li>
<li><a href="ca33858ca0"><code>ca33858</code></a> fix: handle React v18.3 warnings (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10079">#10079</a>)</li>
<li><a href="e012e03158"><code>e012e03</code></a> chore: release Docusaurus 3.2.1 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10016">#10016</a>)</li>
<li><a href="debfc87d34"><code>debfc87</code></a> chore: release Docusaurus v3.2.0 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/10000">#10000</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.3.2/packages/docusaurus-module-type-aliases">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/module-type-aliases&package-manager=npm_and_yarn&previous-version=3.2.0&new-version=3.3.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-05-06 06:06:58 +00:00
dependabot[bot]
eb3ab3aebc
⬆️ Bump sass from 1.75.0 to 1.76.0 in /website (#5313)
Bumps [sass](https://github.com/sass/dart-sass) from 1.75.0 to 1.76.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.76.0</h2>
<p>To install Sass 1.76.0, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>
<p>Throw errors for misplaced statements in keyframe blocks.</p>
</li>
<li>
<p>Mixins and functions whose names begin with <code>--</code> are now deprecated for forwards-compatibility with the in-progress CSS functions and mixins spec. This deprecation is named <code>css-function-mixin</code>.</p>
</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1760">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.76.0</h2>
<ul>
<li>
<p>Throw errors for misplaced statements in keyframe blocks.</p>
</li>
<li>
<p>Mixins and functions whose names begin with <code>--</code> are now deprecated for
forwards-compatibility with the in-progress CSS functions and mixins spec.
This deprecation is named <code>css-function-mixin</code> (see the sketch after this changelog).</p>
</li>
</ul>
</blockquote>
</details>
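
A minimal sketch of what the new `css-function-mixin` deprecation flags, using the sass JS API; the `--pill` mixin is hypothetical and exists only to trigger the warning:

```ts
import {compileString} from 'sass';

// Hypothetical stylesheet: a mixin whose name begins with `--`.
// On sass >= 1.76 this still compiles, but emits a `css-function-mixin`
// deprecation warning, for forwards-compatibility with native CSS mixins.
const source = `
@mixin --pill { border-radius: 999px; }
.tag { @include --pill; }
`;

const {css} = compileString(source);
console.log(css); // renaming the mixin (e.g. to "pill") silences the warning
```
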
<details>
<summary>Commits</summary>
<ul>
<li><a href="264b2d58b0"><code>264b2d5</code></a> Deprecate function and mixin names beginning with <code>--</code> (<a href="https://redirect.github.com/sass/dart-sass/issues/2230">#2230</a>)</li>
<li><a href="f145e1c11b"><code>f145e1c</code></a> Throw errors for misplaced statements in keyframe blocks (<a href="https://redirect.github.com/sass/dart-sass/issues/2226">#2226</a>)</li>
<li><a href="eafc279ae7"><code>eafc279</code></a> Explicitly add a breaking change exemption for invalid CSS output (<a href="https://redirect.github.com/sass/dart-sass/issues/2225">#2225</a>)</li>
<li><a href="b97f26f71f"><code>b97f26f</code></a> Add a per-importer cache for loads that aren't cacheable en masse (<a href="https://redirect.github.com/sass/dart-sass/issues/2219">#2219</a>)</li>
<li><a href="2a9eaadefa"><code>2a9eaad</code></a> Implement access tracking for containingUrl (<a href="https://redirect.github.com/sass/dart-sass/issues/2220">#2220</a>)</li>
<li>See full diff in <a href="https://github.com/sass/dart-sass/compare/1.75.0...1.76.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.75.0&new-version=1.76.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-05-01 05:12:33 +00:00
dependabot[bot]
df423d5e18
⬆️ Bump react-dom from 18.2.0 to 18.3.0 in /website (#5307)
Bumps [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom) from 18.2.0 to 18.3.0.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a href="https://github.com/facebook/react/commits/HEAD/packages/react-dom">compare view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a href="https://www.npmjs.com/~react-bot">react-bot</a>, a new releaser for react-dom since your current version.</p>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=react-dom&package-manager=npm_and_yarn&previous-version=18.2.0&new-version=18.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-04-26 05:17:28 +00:00
Abhishek Pandey
23de1d53dd
Skip more conv tests (#5302)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-04-23 05:38:51 +00:00
Abhishek Pandey
963dd4a11d
Disable conversations integ tests (#5299)
<!-- PR description-->
`CorsoCITeam` group mailbox backup is currently broken because of an invalid `odata.NextLink`, which causes an infinite loop during paging. Disabling conversation backups while we fix the impacted group mailbox; an illustrative guard against this failure mode is sketched below.
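
The guard is simple: remember each `nextLink` already followed and abort if one repeats. A TypeScript sketch for brevity (Corso's own pagers are Go; every name below is hypothetical):

```ts
type Page<T> = {value: T[]; '@odata.nextLink'?: string};

// Accumulate all pages, but fail fast if the service hands back a
// nextLink we have already followed (the infinite-loop symptom above).
async function listAll<T>(
  fetchPage: (url: string) => Promise<Page<T>>,
  firstUrl: string,
): Promise<T[]> {
  const items: T[] = [];
  const seen = new Set<string>();
  let url: string | undefined = firstUrl;
  while (url) {
    if (seen.has(url)) {
      throw new Error(`invalid nextLink: paging loop detected at ${url}`);
    }
    seen.add(url);
    const page = await fetchPage(url);
    items.push(...page.value);
    url = page['@odata.nextLink'];
  }
  return items;
}
```
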



---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-04-18 05:51:21 +00:00
dependabot[bot]
e96f74e634
⬆️ Bump sass from 1.74.1 to 1.75.0 in /website (#5297)
Bumps [sass](https://github.com/sass/dart-sass) from 1.74.1 to 1.75.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.75.0</h2>
<p>To install Sass 1.75.0, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>Fix a bug in which stylesheet canonicalization could be cached incorrectly when custom importers or the Node.js package importer made decisions based on the URL of the containing stylesheet.</li>
</ul>
<h3>JS API</h3>
<ul>
<li>Allow <code>importer</code> to be passed without <code>url</code> in <code>StringOptionsWithImporter</code>.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1750">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.75.0</h2>
<ul>
<li>Fix a bug in which stylesheet canonicalization could be cached incorrectly
when custom importers or the Node.js package importer made decisions based on
the URL of the containing stylesheet.</li>
</ul>
<h3>JS API</h3>
<ul>
<li>Allow <code>importer</code> to be passed without <code>url</code> in <code>StringOptionsWithImporter</code> (see the sketch after this changelog).</li>
</ul>
</blockquote>
</details>
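
The JS API change is easier to see in code. A sketch, assuming a made-up in-memory importer (the `memory:` scheme and stylesheet contents are illustrative); before 1.75 the typings required a `url` alongside `importer` here:

```ts
import {compileString} from 'sass';

const {css} = compileString(`@use 'theme'; body { color: theme.$accent; }`, {
  // 1.75.0: `importer` may now be passed without `url`.
  importer: {
    // Resolve every load to a hypothetical memory: URL.
    canonicalize: (url) => new URL(`memory:${url}`),
    // Serve made-up stylesheet contents for any canonical URL.
    load: () => ({
      contents: '$accent: rebeccapurple;',
      syntax: 'scss',
    }),
  },
});
console.log(css); // body { color: rebeccapurple; }
```
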
<details>
<summary>Commits</summary>
<ul>
<li><a href="821b98e26c"><code>821b98e</code></a> Don't cache canonicalize calls when <code>containingUrl</code> is available (<a href="https://redirect.github.com/sass/dart-sass/issues/2215">#2215</a>)</li>
<li><a href="c5aff1b2f2"><code>c5aff1b</code></a> Make it possible to build npm with a linked language repo (<a href="https://redirect.github.com/sass/dart-sass/issues/2214">#2214</a>)</li>
<li>See full diff in <a href="https://github.com/sass/dart-sass/compare/1.74.1...1.75.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.74.1&new-version=1.75.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-04-12 05:09:36 +00:00
Keepers
b180dee597
adding retries to purge action powershell scripts (#5294)
#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 💻 CI/Deployment

#### Test Plan

- [x] 💚 E2E
2024-04-09 18:32:37 +00:00
dependabot[bot]
44d4821a8d
⬆️ Bump sass from 1.72.0 to 1.74.1 in /website (#5286)
Bumps [sass](https://github.com/sass/dart-sass) from 1.72.0 to 1.74.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.74.1</h2>
<p>To install Sass 1.74.1, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>No user-visible changes.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1741">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.74.1</h2>
<ul>
<li>No user-visible changes.</li>
</ul>
<h2>1.74.0</h2>
<h3>JS API</h3>
<ul>
<li>
<p>Add a new top-level <code>deprecations</code> object, which contains various
<code>Deprecation</code> objects that define the different types of deprecation used by
the Sass compiler and can be passed to the options below (a usage sketch follows this changelog).</p>
</li>
<li>
<p>Add a new <code>fatalDeprecations</code> compiler option that causes the compiler to
error if any deprecation warnings of the provided types are encountered. You
can also pass in a <code>Version</code> object to treat all deprecations that were active
in that Dart Sass version as fatal.</p>
</li>
<li>
<p>Add a new <code>futureDeprecations</code> compiler option that allows you to opt-in to
certain deprecations early (currently just <code>import</code>).</p>
</li>
<li>
<p>Add a new <code>silenceDeprecations</code> compiler option to ignore any deprecation
warnings of the provided types.</p>
</li>
</ul>
<h3>Command-Line Interface</h3>
<ul>
<li>
<p>Add a new <code>--silence-deprecation</code> flag, which causes the compiler to ignore
any deprecation warnings of the provided types.</p>
</li>
<li>
<p>Previously, if a future deprecation was passed to <code>--fatal-deprecation</code> but
not <code>--future-deprecation</code>, it would be treated as fatal despite not being
enabled. Both flags are now required to treat a future deprecation as fatal
with a warning emitted if <code>--fatal-deprecation</code> is passed without
<code>--future-deprecation</code>, matching the JS API's behavior.</p>
</li>
</ul>
<h3>Dart API</h3>
<ul>
<li>
<p>The <code>compile</code> methods now take in a <code>silenceDeprecations</code> parameter, which
causes the compiler to ignore any deprecation warnings of the provided types.</p>
</li>
<li>
<p>Add <code>Deprecation.obsoleteIn</code> to match the JS API. This is currently null for
all deprecations, but will be used once some deprecations become obsolete in
Dart Sass 2.0.0.</p>
</li>
<li>
<p><strong>Potentially breaking bug fix:</strong> Fix a bug where <code>compileStringToResultAsync</code>
ignored <code>fatalDeprecations</code> and <code>futureDeprecations</code>.</p>
</li>
<li>
<p>The behavior around making future deprecations fatal mentioned in the CLI
section above has also been changed in the Dart API.</p>
</li>
</ul>
<h2>1.73.0</h2>
</blockquote>
<p>... (truncated)</p>
</details>
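
A minimal sketch of the 1.74 deprecation options described above, assuming a local `style.scss`; the file name and the chosen deprecation ids (`import`, `slash-div`) are illustrative:

```ts
import {compile, deprecations} from 'sass';

const result = compile('style.scss', {
  // Opt into the future `import` deprecation early...
  futureDeprecations: [deprecations.import],
  // ...and make it fatal; per the 1.74 notes, a future deprecation is
  // only treated as fatal when both flags name it.
  fatalDeprecations: [deprecations.import],
  // Ignore warnings about `/` used as division.
  silenceDeprecations: [deprecations['slash-div']],
});
console.log(result.css);
```
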
<details>
<summary>Commits</summary>
<ul>
<li><a href="1137797f17"><code>1137797</code></a> Fix bulma and release 1.74.1 (<a href="https://redirect.github.com/sass/dart-sass/issues/2210">#2210</a>)</li>
<li><a href="d9220d9c37"><code>d9220d9</code></a> Complete implementation the deprecations API (<a href="https://redirect.github.com/sass/dart-sass/issues/2207">#2207</a>)</li>
<li><a href="783c248d2f"><code>783c248</code></a> Fix typo in function documentation (<a href="https://redirect.github.com/sass/dart-sass/issues/2205">#2205</a>)</li>
<li><a href="c8d064368c"><code>c8d0643</code></a> Better handle filesystem importers when load paths aren't necessary (<a href="https://redirect.github.com/sass/dart-sass/issues/2203">#2203</a>)</li>
<li><a href="9302b3519c"><code>9302b35</code></a> Add support for nesting in plain CSS (<a href="https://redirect.github.com/sass/dart-sass/issues/2198">#2198</a>)</li>
<li><a href="772280a7ff"><code>772280a</code></a> Support linux-riscv64 and windows-arm64 (<a href="https://redirect.github.com/sass/dart-sass/issues/2201">#2201</a>)</li>
<li>See full diff in <a href="https://github.com/sass/dart-sass/compare/1.72.0...1.74.1">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.72.0&new-version=1.74.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-04-04 05:58:44 +00:00
dependabot[bot]
6bbb46b29a
⬆️ Bump @docusaurus/plugin-google-gtag from 3.1.1 to 3.2.0 in /website (#5284)
Bumps [@docusaurus/plugin-google-gtag](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag) from 3.1.1 to 3.2.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/plugin-google-gtag</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.2.0 (2024-03-29)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-types</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9954">#9954</a> feat(sitemap): add support for &quot;lastmod&quot; (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-utils-validation</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9912">#9912</a> feat(blog): add LastUpdateAuthor &amp; LastUpdateTime (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9931">#9931</a> feat(core): add new plugin allContentLoaded lifecycle (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9928">#9928</a> feat(theme-translations) Icelandic (is) (<a href="https://github.com/Hallinn"><code>@​Hallinn</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9886">#9886</a> feat(blog): allow processing blog posts through a processBlogPosts function (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9838">#9838</a> feat(blog): add blog pageBasePath plugin option (<a href="https://github.com/ilg-ul"><code>@​ilg-ul</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9681">#9681</a> feat(swizzle): ask user preferred language if no language CLI option provided (<a href="https://github.com/yixiaojiu"><code>@​yixiaojiu</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9442">#9442</a> feat(create-docusaurus): ask user for preferred language when no language CLI option provided (<a href="https://github.com/Rafael-Martins"><code>@​Rafael-Martins</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-vercel-analytics</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9687">#9687</a> feat(plugin-vercel-analytics): add new vercel analytics plugin (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9684">#9684</a> feat(mdx-loader): the table-of-contents should display toc/headings of imported MDX partials (<a href="https://github.com/anatolykopyl"><code>@​anatolykopyl</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9999">#9999</a> fix(mdx-loader): Ignore contentTitle coming after Markdown thematicBreak (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9945">#9945</a> fix(a11y): move focus algolia-search focus back to search input on Escape (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9920">#9920</a> fix(blog): apply trailing slash to blog feed (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9944">#9944</a> fix(theme): improve a11y of DocSidebarItemCategory expand/collapsed button (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9915">#9915</a> fix(theme-translations): complete and modify Japanese translations (<a href="https://github.com/Suenaga-Ryuya"><code>@​Suenaga-Ryuya</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9910">#9910</a> fix(theme-translations): add Japanese translations (<a href="https://github.com/Suenaga-Ryuya"><code>@​Suenaga-Ryuya</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9872">#9872</a> fix(theme-translations): complete and improve Spanish theme translations (<a href="https://github.com/4troDev"><code>@​4troDev</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9812">#9812</a> fix(i18n): add missing theme translations for fa locale (<a href="https://github.com/VahidNaderi"><code>@​VahidNaderi</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9897">#9897</a> fix(mdx-loader): mdx-code-block should support CRLF (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9878">#9878</a> fix(core): fix default i18n calendar used, infer it from locale if possible (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9852">#9852</a> fix(core): ensure core error boundary is able to render theme layout (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-remark-plugin-npm2yarn</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9861">#9861</a> fix(remark-npm2yarn): update npm-to-yarn from 2.0.0 to 2.2.1, fix pnpm extra args syntax (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9851">#9851</a> fix(theme-classic): should use plurals for category items description (<a href="https://github.com/baradusov"><code>@​baradusov</code></a>)</li>
</ul>
</li>
</ul>
<h4>🏃‍♀️ Performance</h4>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/blob/main/CHANGELOG.md"><code>@​docusaurus/plugin-google-gtag</code>'s changelog</a>.</em></p>
<blockquote>
<h2>3.2.0 (2024-03-29)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-types</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9954">#9954</a> feat(sitemap): add support for &quot;lastmod&quot; (<a href="https://github.com/slorber"><code>@​slorber</code></a>) (see the config sketch after this changelog)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-utils-validation</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9912">#9912</a> feat(blog): add LastUpdateAuthor &amp; LastUpdateTime (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9931">#9931</a> feat(core): add new plugin allContentLoaded lifecycle (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9928">#9928</a> feat(theme-translations) Icelandic (is) (<a href="https://github.com/Hallinn"><code>@​Hallinn</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9886">#9886</a> feat(blog): allow processing blog posts through a processBlogPosts function (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9838">#9838</a> feat(blog): add blog pageBasePath plugin option (<a href="https://github.com/ilg-ul"><code>@​ilg-ul</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9681">#9681</a> feat(swizzle): ask user preferred language if no language CLI option provided (<a href="https://github.com/yixiaojiu"><code>@​yixiaojiu</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9442">#9442</a> feat(create-docusaurus): ask user for preferred language when no language CLI option provided (<a href="https://github.com/Rafael-Martins"><code>@​Rafael-Martins</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-vercel-analytics</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9687">#9687</a> feat(plugin-vercel-analytics): add new vercel analytics plugin (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9684">#9684</a> feat(mdx-loader): the table-of-contents should display toc/headings of imported MDX partials (<a href="https://github.com/anatolykopyl"><code>@​anatolykopyl</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9999">#9999</a> fix(mdx-loader): Ignore contentTitle coming after Markdown thematicBreak (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9945">#9945</a> fix(a11y): move focus algolia-search focus back to search input on Escape (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9920">#9920</a> fix(blog): apply trailing slash to blog feed (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9944">#9944</a> fix(theme): improve a11y of DocSidebarItemCategory expand/collapsed button (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9915">#9915</a> fix(theme-translations): complete and modify Japanese translations (<a href="https://github.com/Suenaga-Ryuya"><code>@​Suenaga-Ryuya</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9910">#9910</a> fix(theme-translations): add Japanese translations (<a href="https://github.com/Suenaga-Ryuya"><code>@​Suenaga-Ryuya</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9872">#9872</a> fix(theme-translations): complete and improve Spanish theme translations (<a href="https://github.com/4troDev"><code>@​4troDev</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9812">#9812</a> fix(i18n): add missing theme translations for fa locale (<a href="https://github.com/VahidNaderi"><code>@​VahidNaderi</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9897">#9897</a> fix(mdx-loader): mdx-code-block should support CRLF (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9878">#9878</a> fix(core): fix default i18n calendar used, infer it from locale if possible (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9852">#9852</a> fix(core): ensure core error boundary is able to render theme layout (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-remark-plugin-npm2yarn</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9861">#9861</a> fix(remark-npm2yarn): update npm-to-yarn from 2.0.0 to 2.2.1, fix pnpm extra args syntax (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9851">#9851</a> fix(theme-classic): should use plurals for category items description (<a href="https://github.com/baradusov"><code>@​baradusov</code></a>)</li>
</ul>
</li>
</ul>
<h4>🏃‍♀️ Performance</h4>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="5af143651b"><code>5af1436</code></a> v3.2.0</li>
<li><a href="49ecd8f472"><code>49ecd8f</code></a> fix(gtag): send the newly rendered page's title instead of the old one's (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7424">#7424</a>)</li>
<li><a href="47a2cca17d"><code>47a2cca</code></a> chore: require Node 16.14 (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7501">#7501</a>)</li>
<li><a href="bf1513a3e3"><code>bf1513a</code></a> refactor: fix a lot of errors in type-aware linting (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7477">#7477</a>)</li>
<li><a href="6b53d4263d"><code>6b53d42</code></a> misc: make copyUntypedFiles work for watch mode (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7445">#7445</a>)</li>
<li><a href="a555fd1dcb"><code>a555fd1</code></a> refactor: make each tsconfig explicitly declare module and include/exclude (#...</li>
<li><a href="7613ecb9ea"><code>7613ecb</code></a> refactor: use TS project references instead of running tsc multiple times (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7">#7</a>...</li>
<li><a href="26df8c83ce"><code>26df8c8</code></a> chore: prepare v2.0.0-beta.20 release (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7347">#7347</a>)</li>
<li><a href="6fa51890f0"><code>6fa5189</code></a> chore: prepare v2.0.0-beta.19 release (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-plugin-google-gtag/issues/7325">#7325</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.2.0/packages/docusaurus-plugin-google-gtag">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/plugin-google-gtag&package-manager=npm_and_yarn&previous-version=3.1.1&new-version=3.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-04-01 05:42:47 +00:00
dependabot[bot]
f197d7cf7b
⬆️ Bump @docusaurus/module-type-aliases from 3.1.1 to 3.2.0 in /website (#5281)
Bumps [@docusaurus/module-type-aliases](https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases) from 3.1.1 to 3.2.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/facebook/docusaurus/releases"><code>@​docusaurus/module-type-aliases</code>'s releases</a>.</em></p>
<blockquote>
<h2>3.2.0 (2024-03-29)</h2>
<h4>🚀 New Feature</h4>
<ul>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-plugin-content-pages</code>, <code>docusaurus-plugin-sitemap</code>, <code>docusaurus-types</code>, <code>docusaurus-utils</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9954">#9954</a> feat(sitemap): add support for &quot;lastmod&quot; (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>, <code>docusaurus-plugin-content-docs</code>, <code>docusaurus-theme-classic</code>, <code>docusaurus-theme-common</code>, <code>docusaurus-utils-validation</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9912">#9912</a> feat(blog): add LastUpdateAuthor &amp; LastUpdateTime (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-debug</code>, <code>docusaurus-types</code>, <code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9931">#9931</a> feat(core): add new plugin allContentLoaded lifecycle (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9928">#9928</a> feat(theme-translations) Icelandic (is) (<a href="https://github.com/Hallinn"><code>@​Hallinn</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9886">#9886</a> feat(blog): allow processing blog posts through a processBlogPosts function (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9838">#9838</a> feat(blog): add blog pageBasePath plugin option (<a href="https://github.com/ilg-ul"><code>@​ilg-ul</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9681">#9681</a> feat(swizzle): ask user preferred language if no language CLI option provided (<a href="https://github.com/yixiaojiu"><code>@​yixiaojiu</code></a>)</li>
</ul>
</li>
<li><code>create-docusaurus</code>, <code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9442">#9442</a> feat(create-docusaurus): ask user for preferred language when no language CLI option provided (<a href="https://github.com/Rafael-Martins"><code>@​Rafael-Martins</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-vercel-analytics</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9687">#9687</a> feat(plugin-vercel-analytics): add new vercel analytics plugin (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9684">#9684</a> feat(mdx-loader): the table-of-contents should display toc/headings of imported MDX partials (<a href="https://github.com/anatolykopyl"><code>@​anatolykopyl</code></a>)</li>
</ul>
</li>
</ul>
<h4>🐛 Bug Fix</h4>
<ul>
<li><code>docusaurus-mdx-loader</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9999">#9999</a> fix(mdx-loader): Ignore contentTitle coming after Markdown thematicBreak (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-search-algolia</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9945">#9945</a> fix(a11y): move focus algolia-search focus back to search input on Escape (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-plugin-content-blog</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9920">#9920</a> fix(blog): apply trailing slash to blog feed (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9944">#9944</a> fix(theme): improve a11y of DocSidebarItemCategory expand/collapsed button (<a href="https://github.com/mxschmitt"><code>@​mxschmitt</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9915">#9915</a> fix(theme-translations): complete and modify Japanese translations (<a href="https://github.com/Suenaga-Ryuya"><code>@​Suenaga-Ryuya</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9910">#9910</a> fix(theme-translations): add Japanese translations (<a href="https://github.com/Suenaga-Ryuya"><code>@​Suenaga-Ryuya</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9872">#9872</a> fix(theme-translations): complete and improve Spanish theme translations (<a href="https://github.com/4troDev"><code>@​4troDev</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9812">#9812</a> fix(i18n): add missing theme translations for fa locale (<a href="https://github.com/VahidNaderi"><code>@​VahidNaderi</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-utils</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9897">#9897</a> fix(mdx-loader): mdx-code-block should support CRLF (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9878">#9878</a> fix(core): fix default i18n calendar used, infer it from locale if possible (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9852">#9852</a> fix(core): ensure core error boundary is able to render theme layout (<a href="https://github.com/slorber"><code>@​slorber</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-remark-plugin-npm2yarn</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9861">#9861</a> fix(remark-npm2yarn): update npm-to-yarn from 2.0.0 to 2.2.1, fix pnpm extra args syntax (<a href="https://github.com/OzakIOne"><code>@​OzakIOne</code></a>)</li>
</ul>
</li>
<li><code>docusaurus-theme-classic</code>, <code>docusaurus-theme-translations</code>
<ul>
<li><a href="https://redirect.github.com/facebook/docusaurus/pull/9851">#9851</a> fix(theme-classic): should use plurals for category items description (<a href="https://github.com/baradusov"><code>@​baradusov</code></a>)</li>
</ul>
</li>
</ul>
<h4>🏃‍♀️ Performance</h4>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="5af143651b"><code>5af1436</code></a> v3.2.0</li>
<li><a href="4388267c26"><code>4388267</code></a> fix(core): various broken anchor link fixes (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/9732">#9732</a>)</li>
<li><a href="fd49301a45"><code>fd49301</code></a> feat(core): make broken link checker detect broken anchors - add `onBrokenAnc...</li>
<li><a href="8dd1e13f2a"><code>8dd1e13</code></a> fix(type-aliases): add <code>title</code> prop for imported inline SVG React components ...</li>
<li><a href="fa1ce230ea"><code>fa1ce23</code></a> refactor: capitalize comments (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/7188">#7188</a>)</li>
<li><a href="171927342f"><code>1719273</code></a> feat(core): fail-safe global data fetching (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/7083">#7083</a>)</li>
<li><a href="24c205a835"><code>24c205a</code></a> refactor: replace non-prop interface with type; allow plugin lifecycles to ha...</li>
<li><a href="3f33e90704"><code>3f33e90</code></a> chore: upgrade dependencies (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/7065">#7065</a>)</li>
<li><a href="77662260f8"><code>7766226</code></a> refactor(core): refactor routes generation logic (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/7054">#7054</a>)</li>
<li><a href="5fb09a2946"><code>5fb09a2</code></a> refactor(core): reorganize files (<a href="https://github.com/facebook/docusaurus/tree/HEAD/packages/docusaurus-module-type-aliases/issues/7042">#7042</a>)</li>
<li>See full diff in <a href="https://github.com/facebook/docusaurus/commits/v3.2.0/packages/docusaurus-module-type-aliases">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=@docusaurus/module-type-aliases&package-manager=npm_and_yarn&previous-version=3.1.1&new-version=3.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.


---

2024-04-01 05:38:16 +00:00
dependabot[bot]
6c9de9bef3
⬆️ Bump express from 4.18.2 to 4.19.2 in /website (#5276)
Bumps [express](https://github.com/expressjs/express) from 4.18.2 to 4.19.2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/expressjs/express/releases">express's releases</a>.</em></p>
<blockquote>
<h2>4.19.2</h2>
<h2>What's Changed</h2>
<ul>
<li><a href="0b746953c4">Improved fix for open redirect allow list bypass</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/expressjs/express/compare/4.19.1...4.19.2">https://github.com/expressjs/express/compare/4.19.1...4.19.2</a></p>
<h2>4.19.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Fix ci after location patch by <a href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5552">expressjs/express#5552</a></li>
<li>fixed un-edited version in history.md for 4.19.0 by <a href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5556">expressjs/express#5556</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/expressjs/express/compare/4.19.0...4.19.1">https://github.com/expressjs/express/compare/4.19.0...4.19.1</a></p>
<h2>4.19.0</h2>
<h2>What's Changed</h2>
<ul>
<li>fix typo in release date by <a href="https://github.com/UlisesGascon"><code>@​UlisesGascon</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5527">expressjs/express#5527</a></li>
<li>docs: nominating <a href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> to be project captian by <a href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5511">expressjs/express#5511</a></li>
<li>docs: loosen TC activity rules by <a href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5510">expressjs/express#5510</a></li>
<li>Add note on how to update docs for new release by <a href="https://github.com/crandmck"><code>@​crandmck</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5541">expressjs/express#5541</a></li>
<li><a href="660ccf5fa3">Prevent open redirect allow list bypass due to encodeurl</a></li>
<li>Release 4.19.0 by <a href="https://github.com/wesleytodd"><code>@​wesleytodd</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5551">expressjs/express#5551</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/crandmck"><code>@​crandmck</code></a> made their first contribution in <a href="https://redirect.github.com/expressjs/express/pull/5541">expressjs/express#5541</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/expressjs/express/compare/4.18.3...4.19.0">https://github.com/expressjs/express/compare/4.18.3...4.19.0</a></p>
<h2>4.18.3</h2>
<h2>Main Changes</h2>
<ul>
<li>Fix routing requests without method</li>
<li>deps: body-parser@1.20.2
<ul>
<li>Fix strict json error message on Node.js 19+</li>
<li>deps: content-type@~1.0.5</li>
<li>deps: raw-body@2.5.2</li>
</ul>
</li>
</ul>
<h2>Other Changes</h2>
<ul>
<li>Use https: protocol instead of deprecated git: protocol by <a href="https://github.com/vcsjones"><code>@​vcsjones</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5032">expressjs/express#5032</a></li>
<li>build: Node.js@16.18 and Node.js@18.12 by <a href="https://github.com/abenhamdine"><code>@​abenhamdine</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5034">expressjs/express#5034</a></li>
<li>ci: update actions/checkout to v3 by <a href="https://github.com/armujahid"><code>@​armujahid</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5027">expressjs/express#5027</a></li>
<li>test: remove unused function arguments in params by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5124">expressjs/express#5124</a></li>
<li>Remove unused originalIndex from acceptParams by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5119">expressjs/express#5119</a></li>
<li>Fixed typos by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5117">expressjs/express#5117</a></li>
<li>examples: remove unused params by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5113">expressjs/express#5113</a></li>
<li>fix: parameter str is not described in JSDoc by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5130">expressjs/express#5130</a></li>
<li>fix: typos in History.md by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5131">expressjs/express#5131</a></li>
<li>build : add Node.js@19.7 by <a href="https://github.com/abenhamdine"><code>@​abenhamdine</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5028">expressjs/express#5028</a></li>
<li>test: remove unused function arguments in params by <a href="https://github.com/raksbisht"><code>@​raksbisht</code></a> in <a href="https://redirect.github.com/expressjs/express/pull/5137">expressjs/express#5137</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/expressjs/express/blob/master/History.md">express's changelog</a>.</em></p>
<blockquote>
<h1>4.19.2 / 2024-03-25</h1>
<ul>
<li>Improved fix for open redirect allow list bypass</li>
</ul>
<h1>4.19.1 / 2024-03-20</h1>
<ul>
<li>Allow passing non-strings to res.location with new encoding handling checks</li>
</ul>
<h1>4.19.0 / 2024-03-20</h1>
<ul>
<li>Prevent open redirect allow list bypass due to encodeurl</li>
<li>deps: cookie@0.6.0</li>
</ul>
<h1>4.18.3 / 2024-02-29</h1>
<ul>
<li>Fix routing requests without method</li>
<li>deps: body-parser@1.20.2
<ul>
<li>Fix strict json error message on Node.js 19+</li>
<li>deps: content-type@~1.0.5</li>
<li>deps: raw-body@2.5.2</li>
</ul>
</li>
<li>deps: cookie@0.6.0
<ul>
<li>Add <code>partitioned</code> option</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="04bc62787b"><code>04bc627</code></a> 4.19.2</li>
<li><a href="da4d763ff6"><code>da4d763</code></a> Improved fix for open redirect allow list bypass</li>
<li><a href="4f0f6cc67d"><code>4f0f6cc</code></a> 4.19.1</li>
<li><a href="a003cfab03"><code>a003cfa</code></a> Allow passing non-strings to res.location with new encoding handling checks f...</li>
<li><a href="a1fa90fcea"><code>a1fa90f</code></a> fixed un-edited version in history.md for 4.19.0</li>
<li><a href="11f2b1db22"><code>11f2b1d</code></a> build: fix build due to inconsistent supertest behavior in older versions</li>
<li><a href="084e36506a"><code>084e365</code></a> 4.19.0</li>
<li><a href="0867302ddb"><code>0867302</code></a> Prevent open redirect allow list bypass due to encodeurl</li>
<li><a href="567c9c665d"><code>567c9c6</code></a> Add note on how to update docs for new release (<a href="https://redirect.github.com/expressjs/express/issues/5541">#5541</a>)</li>
<li><a href="69a4cf2819"><code>69a4cf2</code></a> deps: cookie@0.6.0</li>
<li>Additional commits viewable in <a href="https://github.com/expressjs/express/compare/4.18.2...4.19.2">compare view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a href="https://www.npmjs.com/~wesleytodd">wesleytodd</a>, a new releaser for express since your current version.</p>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=express&package-manager=npm_and_yarn&previous-version=4.18.2&new-version=4.19.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.


---

You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/alcionai/corso/network/alerts).
2024-03-25 21:07:53 +00:00
ashmrtn
686867bd96
Pull in kopia with fixed manifest error wrap (#5273)
The updated kopia version contains only the change that avoids clobbering manifest compaction error messages.
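
For illustration, a minimal Go sketch of the wrapping pattern this relies on — wrapping with `%w` preserves the underlying message instead of replacing it. The function names are hypothetical stand-ins, not kopia's actual API:

```go
package main

import (
	"errors"
	"fmt"
)

// compactManifests stands in for the kopia operation whose error
// message must survive intact.
func compactManifests() error {
	return errors.New("manifest compaction failed: missing content blob")
}

// flushIndexes wraps the underlying error with %w, so the original
// compaction message is preserved rather than clobbered.
func flushIndexes() error {
	if err := compactManifests(); err != nil {
		return fmt.Errorf("flushing indexes: %w", err)
	}
	return nil
}

func main() {
	// Prints: flushing indexes: manifest compaction failed: missing content blob
	fmt.Println(flushIndexes())
}
```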

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-03-25 20:57:33 +00:00
ashmrtn
cd41d2fbce
Reduce test to just a single site to avoid failure (#5274)
#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-03-25 20:16:43 +00:00
dependabot[bot]
2b79c1b797
⬆️ Bump sass from 1.71.0 to 1.72.0 in /website (#5262)
Bumps [sass](https://github.com/sass/dart-sass) from 1.71.0 to 1.72.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.72.0</h2>
<p>To install Sass 1.72.0, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<ul>
<li>
<p>Support adjacent <code>/</code>s without whitespace in between when parsing plain CSS expressions.</p>
</li>
<li>
<p>Allow the Node.js <code>pkg:</code> importer to load Sass stylesheets for <code>package.json</code> <code>exports</code> field entries without extensions.</p>
</li>
<li>
<p>When printing suggestions for variables, use underscores in variable names when the original usage used underscores.</p>
</li>
</ul>
<h3>JavaScript API</h3>
<ul>
<li>Properly resolve <code>pkg:</code> imports with the Node.js package importer when arguments are passed to the JavaScript process.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1720">full changelog</a> for changes in earlier releases.</p>
<h2>Dart Sass 1.71.1</h2>
<p>To install Sass 1.71.1, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<h3>Command-Line Interface</h3>
<ul>
<li>Ship the musl Linux release with the proper Dart executable.</li>
</ul>
<h3>JavaScript API</h3>
<ul>
<li>
<p>Export the <code>NodePackageImporter</code> class in ESM mode.</p>
</li>
<li>
<p>Allow <code>NodePackageImporter</code> to locate a default directory even when the entrypoint is an ESM module.</p>
</li>
</ul>
<h3>Dart API</h3>
<ul>
<li>Make passing a null argument to <code>NodePackageImporter()</code> a static error rather than just a runtime error.</li>
</ul>
<h3>Embedded Sass</h3>
<ul>
<li>In the JS Embedded Host, properly install the musl Linux embedded compiler when running on musl Linux.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1711">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.72.0</h2>
<ul>
<li>
<p>Support adjacent <code>/</code>s without whitespace in between when parsing plain CSS
expressions.</p>
</li>
<li>
<p>Allow the Node.js <code>pkg:</code> importer to load Sass stylesheets for <code>package.json</code>
<code>exports</code> field entries without extensions.</p>
</li>
<li>
<p>When printing suggestions for variables, use underscores in variable names
when the original usage used underscores.</p>
</li>
</ul>
<h3>JavaScript API</h3>
<ul>
<li>Properly resolve <code>pkg:</code> imports with the Node.js package importer when
arguments are passed to the JavaScript process.</li>
</ul>
<h2>1.71.1</h2>
<h3>Command-Line Interface</h3>
<ul>
<li>Ship the musl Linux release with the proper Dart executable.</li>
</ul>
<h3>JavaScript API</h3>
<ul>
<li>
<p>Export the <code>NodePackageImporter</code> class in ESM mode.</p>
</li>
<li>
<p>Allow <code>NodePackageImporter</code> to locate a default directory even when the
entrypoint is an ESM module.</p>
</li>
</ul>
<h3>Dart API</h3>
<ul>
<li>Make passing a null argument to <code>NodePackageImporter()</code> a static error rather
than just a runtime error.</li>
</ul>
<h3>Embedded Sass</h3>
<ul>
<li>In the JS Embedded Host, properly install the musl Linux embedded compiler
when running on musl Linux.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="ce16b35ca1"><code>ce16b35</code></a> Cut a release (<a href="https://redirect.github.com/sass/dart-sass/issues/2194">#2194</a>)</li>
<li><a href="9af6bbf8a0"><code>9af6bbf</code></a> Properly handle <code>pkg:</code> imports with args (<a href="https://redirect.github.com/sass/dart-sass/issues/2193">#2193</a>)</li>
<li><a href="033049102b"><code>0330491</code></a> Update to node 20 (<a href="https://redirect.github.com/sass/dart-sass/issues/2192">#2192</a>)</li>
<li><a href="48e2d0cb02"><code>48e2d0c</code></a> Preserve underscores in <code>VariableExpression.toString()</code> (<a href="https://redirect.github.com/sass/dart-sass/issues/2185">#2185</a>)</li>
<li><a href="6e2d637ac3"><code>6e2d637</code></a> Allow adjacent forward slashes in plain CSS expressions (<a href="https://redirect.github.com/sass/dart-sass/issues/2190">#2190</a>)</li>
<li><a href="fa4d909f92"><code>fa4d909</code></a> Bump softprops/action-gh-release from 1 to 2 (<a href="https://redirect.github.com/sass/dart-sass/issues/2191">#2191</a>)</li>
<li><a href="fd67fe678c"><code>fd67fe6</code></a> [Hotfix Node Package Importer]- Handle subpath without extensions (<a href="https://redirect.github.com/sass/dart-sass/issues/2184">#2184</a>)</li>
<li><a href="1b4d703ad3"><code>1b4d703</code></a> Release 1.71.1 (<a href="https://redirect.github.com/sass/dart-sass/issues/2182">#2182</a>)</li>
<li><a href="6d66c4376a"><code>6d66c43</code></a> Properly handle <code>new NodePackageImporter()</code> with an ESM entrypoint (<a href="https://redirect.github.com/sass/dart-sass/issues/2181">#2181</a>)</li>
<li><a href="85a932f648"><code>85a932f</code></a> Add missing ESM export of NodePackageImporter (<a href="https://redirect.github.com/sass/dart-sass/issues/2177">#2177</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/sass/dart-sass/compare/1.71.0...1.72.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.71.0&new-version=1.72.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.


---

2024-03-14 05:44:43 +00:00
dependabot[bot]
2ab6d34538
⬆️ Bump mermaid from 10.8.0 to 10.9.0 in /website (#5256)
Bumps [mermaid](https://github.com/mermaid-js/mermaid) from 10.8.0 to 10.9.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/mermaid-js/mermaid/releases">mermaid's releases</a>.</em></p>
<blockquote>
<h2>v10.9.0</h2>
<h1>Release Notes</h1>
<p>We now have Katex support!</p>
<p><img src="https://user-images.githubusercontent.com/16135852/161395052-4dee19da-faa6-4934-94f3-dd0cbcecfccd.png" alt="image" />
<a href="https://mermaid.live/edit#pako:eNptUl1r6zAM_SvClJHQ5mFh7I7CBuu2tz3tPl7dDsdxUkOTdI46Mlz998lJCh3ML5KOdc7xh4IyXWnVWtVeH3bw-oYtyHr8h2qxqBJEb_d6CAOncA-IrqX3kMVY0Rdv50Q2dppCxdI_uBRxZbchF_TgwEHEYGBBy5guFqj-Q5Y9nKIF4sbVtfCKKUg1FYmEymsTskJ0GsT-w1Motnl2ow1zyDWnY2M6kqZ8Luo0mpxgk0SL6-WsJIfK8ngoZgg_0Jtf0dtf0btL1JQd9SwrGqbz211crvu0vhCmDXpZLA1vAyLZgULbkZ1YJ3hKpubDTvc2ZH_u5F2N8-ZSdBNF4XnsHMavKGzt2mCE0jNouJqFXQUMhewjgvmJloLatpw5F-JPo_jLJJ5Qem_er88GRaPJu4Hl003XB2K5dS848WiRn-FZ-9weB4AmD7VSjfWNdqWMWYiOqGhnG4tqLWlpK33cEypsWVr1kbq_X61Ra_JHu1LHQ6nJPjstA9qodaX3veVvObPoTQ">Demo</a></p>
<h2>🚀 Features</h2>
<ul>
<li>Bump <code>@​zenuml/core</code> and update render options in mermaid-zenuml (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5257">#5257</a>) <a href="https://github.com/dontry"><code>@​dontry</code></a></li>
<li>Implement &quot;until&quot; keyword in gantt charts (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5224">#5224</a>) <a href="https://github.com/fzag"><code>@​fzag</code></a></li>
<li>Integrated Katex typesetting into flowcharts (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/2885">#2885</a>) <a href="https://github.com/NicolasNewman"><code>@​NicolasNewman</code></a></li>
</ul>
<h2>🧰 Maintenance</h2>
<ul>
<li>Add gitgraph parallel commits to docs (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5336">#5336</a>) <a href="https://github.com/NicolasCwy"><code>@​NicolasCwy</code></a></li>
<li>Update recommended Vitest extension (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5322">#5322</a>) <a href="https://github.com/NicolasCwy"><code>@​NicolasCwy</code></a></li>
<li>Correcting path to docker-entrypoint.sh (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5327">#5327</a>) <a href="https://github.com/bstordrup"><code>@​bstordrup</code></a></li>
<li>Fix chrome webstore url causing 404 (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5352">#5352</a>) <a href="https://github.com/Abrifq"><code>@​Abrifq</code></a></li>
<li>Fix color and arrow for merge commit (gitGraph) (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5152">#5152</a>) <a href="https://github.com/macherel"><code>@​macherel</code></a></li>
<li>Fix link to Contributors section in README (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5298">#5298</a>) <a href="https://github.com/BaumiCoder"><code>@​BaumiCoder</code></a></li>
<li>Fix macOS onboarding issues (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5262">#5262</a>) <a href="https://github.com/thedustin"><code>@​thedustin</code></a></li>
<li>Fix netlify deploy (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5332">#5332</a>) <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a></li>
<li>Link to webhelp (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5316">#5316</a>) <a href="https://github.com/BaumiCoder"><code>@​BaumiCoder</code></a></li>
<li>Update contribute (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5268">#5268</a>) <a href="https://github.com/FutzMonitor"><code>@​FutzMonitor</code></a></li>
<li>Updates Timeline Documentation (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5315">#5315</a>) <a href="https://github.com/FutzMonitor"><code>@​FutzMonitor</code></a></li>
<li>[Docs] Updated chrome extension url's to new store (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5297">#5297</a>) <a href="https://github.com/Abrifq"><code>@​Abrifq</code></a></li>
<li>chore: Update CSpell configuration (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5286">#5286</a>) <a href="https://github.com/Jason3S"><code>@​Jason3S</code></a></li>
<li>feat: add name field to the actors (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5284">#5284</a>) <a href="https://github.com/ad1992"><code>@​ad1992</code></a></li>
<li>fix typo cutomers =&gt; customers (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5269">#5269</a>) <a href="https://github.com/elgalu"><code>@​elgalu</code></a></li>
</ul>
<h2>📚 Documentation</h2>
<ul>
<li>Add new extension to integrations (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5287">#5287</a>) <a href="https://github.com/BoDonkey"><code>@​BoDonkey</code></a></li>
<li>Added link to Blazorade Mermaid to the community integrations page. (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5333">#5333</a>) <a href="https://github.com/MikaBerglund"><code>@​MikaBerglund</code></a></li>
<li>Replace version number placeholder (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5304">#5304</a>) <a href="https://github.com/BaumiCoder"><code>@​BaumiCoder</code></a></li>
</ul>
<p>🎉 <strong>Thanks to all contributors helping with this release!</strong> 🎉</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="da33867ad7"><code>da33867</code></a> Draft release</li>
<li><a href="539010c65c"><code>539010c</code></a> Merge pull request <a href="https://redirect.github.com/mermaid-js/mermaid/issues/5337">#5337</a> from mermaid-js/release/10.9.0</li>
<li><a href="cbe44a6cff"><code>cbe44a6</code></a> v10.9.0</li>
<li><a href="b077fedd4c"><code>b077fed</code></a> Merge branch 'develop' into release/10.9.0</li>
<li><a href="5aa884f594"><code>5aa884f</code></a> Merge pull request <a href="https://redirect.github.com/mermaid-js/mermaid/issues/5354">#5354</a> from mermaid-js/renovate/patch-all-patch</li>
<li><a href="5b3f320e5d"><code>5b3f320</code></a> Merge branch 'develop' into renovate/patch-all-patch</li>
<li><a href="803e068630"><code>803e068</code></a> Merge branch 'master' into develop</li>
<li><a href="3147bb34ee"><code>3147bb3</code></a> Update docs</li>
<li><a href="8daa28dd8b"><code>8daa28d</code></a> Lychee ignore chrome webstore</li>
<li><a href="231534a0db"><code>231534a</code></a> Update link</li>
<li>Additional commits viewable in <a href="https://github.com/mermaid-js/mermaid/compare/v10.8.0...v10.9.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mermaid&package-manager=npm_and_yarn&previous-version=10.8.0&new-version=10.9.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.


---

2024-03-06 05:45:41 +00:00
Abin Simon
e0884c734c
Fix ics high memory usage by disabling auto wrap text in html2text (#5244)
Related: jaytaylor/html2text#48
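
As an illustration, a minimal Go sketch of invoking the converter. `FromString` and the `TextOnly` option are upstream jaytaylor/html2text API; the auto-wrap toggle this PR disables comes from the html2text version corso pulls in, and is assumed rather than shown here:

```go
package main

import (
	"fmt"
	"log"
	"strings"

	"jaytaylor.com/html2text"
)

func main() {
	// A pathological single-line HTML body, similar to what a large ICS
	// event description can produce.
	html := "<p>" + strings.Repeat("calendar event body ", 10000) + "</p>"

	// TextOnly exists upstream; the wrap-disabling option itself is not
	// spelled out in this log, so it is omitted here.
	text, err := html2text.FromString(html, html2text.Options{TextOnly: true})
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println("converted", len(text), "bytes")
}
```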

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-23 06:20:59 +00:00
ashmrtn
f3fdb4a885
harden sanitree population (#5237)
Allow sanity tree checking to require that multiple folder subtrees have no
errors. This lets us verify that both the source folder subtree and the
restore folder subtree are populated without issue.
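
A minimal Go sketch of the hardened check, using hypothetical types in place of corso's sanitree structures — every required subtree must be present and carry zero errors:

```go
package main

import "fmt"

// Node is a hypothetical stand-in for a sanitree node.
type Node struct {
	Name     string
	Errors   []error
	Children []*Node
}

// assertNoErrors fails if the subtree is missing or any node in it
// recorded an error during population.
func assertNoErrors(root *Node) error {
	if root == nil {
		return fmt.Errorf("required subtree missing")
	}
	if len(root.Errors) > 0 {
		return fmt.Errorf("subtree %q has %d errors", root.Name, len(root.Errors))
	}
	for _, child := range root.Children {
		if err := assertNoErrors(child); err != nil {
			return err
		}
	}
	return nil
}

// checkSubtrees requires that every listed subtree populated cleanly.
func checkSubtrees(roots ...*Node) error {
	for _, root := range roots {
		if err := assertNoErrors(root); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	source := &Node{Name: "source"}
	restore := &Node{Name: "restore"}
	fmt.Println(checkSubtrees(source, restore)) // <nil> when both subtrees are clean
}
```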

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [x] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [ ] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-21 23:53:12 +00:00
ashmrtn
f4dbaf60b0
Normalize case when checking for user email (#5240)
Reduce the chance of spurious test failures by normalizing the case of user emails prior to comparison.
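
A minimal Go sketch of the comparison, assuming `strings.EqualFold` as the normalization mechanism (corso's actual helper may differ):

```go
package main

import (
	"fmt"
	"strings"
)

// matchesUserEmail reports whether two addresses refer to the same user,
// ignoring case, since the service may return either casing.
func matchesUserEmail(got, want string) bool {
	return strings.EqualFold(strings.TrimSpace(got), strings.TrimSpace(want))
}

func main() {
	fmt.Println(matchesUserEmail("User@Example.COM", "user@example.com")) // true
}
```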

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-20 22:45:21 +00:00
Niraj Tolia
b9b5650506
Tweak website (#5238)
#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🗺️ Documentation
2024-02-19 04:56:16 +00:00
Abhishek Pandey
f28e79c098
Log token lifetimes on 401 errors (#5239)
<!-- PR description-->

* When we encounter 401s, process the JWT token present in the [`Authorization` header](https://learn.microsoft.com/en-us/entra/identity-platform/v2-oauth2-client-creds-grant-flow#use-a-token) of the request.
* Log the token's issued-at and expires-at times (see the sketch below).
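
A minimal Go sketch of the approach: decode the JWT's payload segment without verifying the signature (enough for diagnostics) and surface the `iat`/`exp` claims. The claim names are standard JWT; everything else here is illustrative:

```go
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"strings"
	"time"
)

// tokenLifetimes decodes a JWT payload without verifying the signature
// (sufficient for logging) and returns the issued-at and expires-at
// claims as timestamps.
func tokenLifetimes(rawJWT string) (iat, exp time.Time, err error) {
	parts := strings.Split(rawJWT, ".")
	if len(parts) != 3 {
		return iat, exp, fmt.Errorf("malformed JWT: %d segments", len(parts))
	}

	payload, err := base64.RawURLEncoding.DecodeString(parts[1])
	if err != nil {
		return iat, exp, fmt.Errorf("decoding payload: %w", err)
	}

	var claims struct {
		Iat int64 `json:"iat"`
		Exp int64 `json:"exp"`
	}
	if err := json.Unmarshal(payload, &claims); err != nil {
		return iat, exp, fmt.Errorf("parsing claims: %w", err)
	}

	return time.Unix(claims.Iat, 0), time.Unix(claims.Exp, 0), nil
}

func main() {
	// In the real flow the token comes from the request's Authorization
	// header ("Bearer <token>"); a synthetic one is built here.
	payload := base64.RawURLEncoding.EncodeToString([]byte(`{"iat":1708200000,"exp":1708203600}`))
	token := "eyJhbGciOiJub25lIn0." + payload + "."

	iat, exp, err := tokenLifetimes(token)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Printf("token issued at %v, expires at %v\n", iat.UTC(), exp.UTC())
}
```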
---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-18 00:20:55 +00:00
Abin Simon
42af271526
Close body of file after writing to zip file in export (#5234)
<!-- PR description-->
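
A minimal Go sketch of the pattern, with hypothetical item types in place of corso's export types: each body is closed as soon as it has been written into the zip, rather than left open for the rest of the export:

```go
package main

import (
	"archive/zip"
	"bytes"
	"fmt"
	"io"
	"strings"
)

// writeAll streams each item's body into the zip and closes the body
// immediately after it is written, instead of deferring every close to
// the end of the export (which would leak open readers).
func writeAll(zw *zip.Writer, items map[string]io.ReadCloser) error {
	for name, body := range items {
		w, err := zw.Create(name)
		if err != nil {
			body.Close()
			return err
		}

		_, copyErr := io.Copy(w, body)
		closeErr := body.Close() // close right after writing
		if copyErr != nil {
			return copyErr
		}
		if closeErr != nil {
			return closeErr
		}
	}
	return nil
}

func main() {
	var buf bytes.Buffer
	zw := zip.NewWriter(&buf)

	items := map[string]io.ReadCloser{
		"export/a.txt": io.NopCloser(strings.NewReader("hello")),
	}
	if err := writeAll(zw, items); err != nil {
		fmt.Println("error:", err)
		return
	}
	if err := zw.Close(); err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println("zip size:", buf.Len())
}
```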

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-17 17:35:18 +00:00
dependabot[bot]
d87435fdc2
⬆️ Bump sass from 1.70.0 to 1.71.0 in /website (#5235)
Bumps [sass](https://github.com/sass/dart-sass) from 1.70.0 to 1.71.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/releases">sass's releases</a>.</em></p>
<blockquote>
<h2>Dart Sass 1.71.0</h2>
<p>To install Sass 1.71.0, download one of the packages below and <a href="https://katiek2.github.io/path-doc/">add it to your PATH</a>, or see <a href="https://sass-lang.com/install">the Sass website</a> for full installation instructions.</p>
<h1>Changes</h1>
<p>For more information about <code>pkg:</code> importers, see <a href="https://sass-lang.com/blog/announcing-pkg-importers">the announcement</a> on the Sass blog.</p>
<h3>Command-Line Interface</h3>
<ul>
<li>Add a <code>--pkg-importer</code> flag to enable built-in <code>pkg:</code> importers. Currently this only supports the Node.js package resolution algorithm, via <code>--pkg-importer=node</code>. For example, <code>@use &quot;pkg:bootstrap&quot;</code> will load <code>node_modules/bootstrap/scss/bootstrap.scss</code>.</li>
</ul>
<h3>JavaScript API</h3>
<ul>
<li>Add a <code>NodePackageImporter</code> importer that can be passed to the <code>importers</code> option. This loads files using the <code>pkg:</code> URL scheme according to the Node.js package resolution algorithm. For example, <code>@use &quot;pkg:bootstrap&quot;</code> will load <code>node_modules/bootstrap/scss/bootstrap.scss</code>. The constructor takes a single optional argument, which indicates the base directory to use when locating <code>node_modules</code> directories. It defaults to <code>path.dirname(require.main.filename)</code>.</li>
</ul>
<h3>Dart API</h3>
<ul>
<li>Add a <code>NodePackageImporter</code> importer that can be passed to the <code>importers</code> option. This loads files using the <code>pkg:</code> URL scheme according to the Node.js package resolution algorithm. For example, <code>@use &quot;pkg:bootstrap&quot;</code> will load <code>node_modules/bootstrap/scss/bootstrap.scss</code>. The constructor takes a single argument, which indicates the base directory to use when locating <code>node_modules</code> directories.</li>
</ul>
<p>See the <a href="https://github.com/sass/dart-sass/blob/master/CHANGELOG.md#1710">full changelog</a> for changes in earlier releases.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/sass/dart-sass/blob/main/CHANGELOG.md">sass's changelog</a>.</em></p>
<blockquote>
<h2>1.71.0</h2>
<p>For more information about <code>pkg:</code> importers, see <a href="https://sass-lang.com/blog/announcing-pkg-importers">the
announcement</a> on the Sass blog.</p>
<h3>Command-Line Interface</h3>
<ul>
<li>Add a <code>--pkg-importer</code> flag to enable built-in <code>pkg:</code> importers. Currently
this only supports the Node.js package resolution algorithm, via
<code>--pkg-importer=node</code>. For example, <code>@use &quot;pkg:bootstrap&quot;</code> will load
<code>node_modules/bootstrap/scss/bootstrap.scss</code>.</li>
</ul>
<h3>JavaScript API</h3>
<ul>
<li>Add a <code>NodePackageImporter</code> importer that can be passed to the <code>importers</code>
option. This loads files using the <code>pkg:</code> URL scheme according to the Node.js
package resolution algorithm. For example, <code>@use &quot;pkg:bootstrap&quot;</code> will load
<code>node_modules/bootstrap/scss/bootstrap.scss</code>. The constructor takes a single
optional argument, which indicates the base directory to use when locating
<code>node_modules</code> directories. It defaults to
<code>path.dirname(require.main.filename)</code>.</li>
</ul>
<h3>Dart API</h3>
<ul>
<li>Add a <code>NodePackageImporter</code> importer that can be passed to the <code>importers</code>
option. This loads files using the <code>pkg:</code> URL scheme according to the Node.js
package resolution algorithm. For example, <code>@use &quot;pkg:bootstrap&quot;</code> will load
<code>node_modules/bootstrap/scss/bootstrap.scss</code>. The constructor takes a single
argument, which indicates the base directory to use when locating
<code>node_modules</code> directories.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="3e6721e79f"><code>3e6721e</code></a> Fix new static warnings with Dart 3.3 (<a href="https://redirect.github.com/sass/dart-sass/issues/2173">#2173</a>)</li>
<li><a href="2cab33e2b3"><code>2cab33e</code></a> Update the language revision in Homebrew on release (<a href="https://redirect.github.com/sass/dart-sass/issues/2171">#2171</a>)</li>
<li><a href="84ededd368"><code>84ededd</code></a> Use musl support in cli_pkg (<a href="https://redirect.github.com/sass/dart-sass/issues/2172">#2172</a>)</li>
<li><a href="00571ec531"><code>00571ec</code></a> Add a <code>--pkg-importer</code> flag (<a href="https://redirect.github.com/sass/dart-sass/issues/2169">#2169</a>)</li>
<li><a href="84f31f0def"><code>84f31f0</code></a> Update pubspec/changelog for <code>pkg:</code> importers (<a href="https://redirect.github.com/sass/dart-sass/issues/2168">#2168</a>)</li>
<li><a href="9ee5408211"><code>9ee5408</code></a> [Package Importer] Dart Implementation (<a href="https://redirect.github.com/sass/dart-sass/issues/2130">#2130</a>)</li>
<li><a href="9423aa53ae"><code>9423aa5</code></a> Use macos-14 runner instead of macos-latest-xlarge runner (<a href="https://redirect.github.com/sass/dart-sass/issues/2167">#2167</a>)</li>
<li><a href="bbf97b4fb4"><code>bbf97b4</code></a> Remove the sass dependency from package.json (<a href="https://redirect.github.com/sass/dart-sass/issues/2162">#2162</a>)</li>
<li>See full diff in <a href="https://github.com/sass/dart-sass/compare/1.70.0...1.71.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sass&package-manager=npm_and_yarn&previous-version=1.70.0&new-version=1.71.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-02-16 05:34:00 +00:00
Keepers
8bdf86bbad
apply missing opts to api client (#5233)
#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🐛 Bugfix

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-02-15 17:25:49 +00:00
Abin Simon
bf52fdbe6a
Log every 1000 items when exporting (#5227)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-15 06:45:16 +00:00
ashmrtn
90d6db486b
Log if the context expired during retry (#5229)
#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [ ] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-15 01:24:42 +00:00
Keepers
f10730cf98
add new control opt for skipping event 503s (#5223)
adds a new control option for skipping certain event item 503 failures.
Also adds a skip cause for that case, and exports the skipCause value
in preparation for future use.
2024-02-14 16:55:52 -07:00
Keepers
bb2bd6df3f
add authentication to requester (#5198)
The graph requester for large item downloads now includes the option to authenticate requests. The option is configured when the requester is created, so all requests using that requester are either authenticated or not. In our case, we're opting to authenticate all requests, since we don't use this requester for non-graph API calls, and even if we did, the addition of auth headers is likely benign.
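
A minimal Go sketch of the shape of this; the names are illustrative, not corso's actual types:

```go
package sketch

import "net/http"

// requester decides authentication once, at construction time; every
// request issued through it is then uniformly authenticated or not.
type requester struct {
	client       *http.Client
	authenticate bool
}

func newRequester(authenticate bool) *requester {
	return &requester{client: http.DefaultClient, authenticate: authenticate}
}

// Do attaches the bearer token only when the requester was built with
// authentication enabled.
func (r *requester) Do(req *http.Request, token string) (*http.Response, error) {
	if r.authenticate {
		req.Header.Set("Authorization", "Bearer "+token)
	}
	return r.client.Do(req)
}
```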

---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🌻 Feature

#### Test Plan

- [x] 💚 E2E
2024-02-14 17:50:36 +00:00
ashmrtn
5e8407a970
Don't alert on old compressor (#5222)
When verifying the repo config, don't create an alert if the repo has the old s2-default compressor that we temporarily used.

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-13 21:18:29 +00:00
Hitesh Pattanayak
4b56754546
sanitizes replyTo emailAddresses (#5221)
sanitizes replyTo emailAddresses based on the checks below (sketched after the list):
- valid email address format
- valid DN format
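
A minimal sketch of the two checks, assuming the DN test is a simple Exchange legacy-DN prefix match (the exact rule and names are illustrative):

```go
package sketch

import (
	"net/mail"
	"strings"
)

// keepReplyTo reports whether a replyTo address should be retained:
// either a valid RFC 5322 email address or a legacy-DN-looking value.
// The "/o=" prefix test is an assumption, not necessarily corso's rule.
func keepReplyTo(addr string) bool {
	if _, err := mail.ParseAddress(addr); err == nil {
		return true
	}
	return strings.HasPrefix(addr, "/o=")
}

// sanitizeReplyTo drops addresses that match neither format.
func sanitizeReplyTo(addrs []string) []string {
	out := make([]string, 0, len(addrs))
	for _, a := range addrs {
		if keepReplyTo(a) {
			out = append(out, a)
		}
	}
	return out
}
```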

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
INC-43

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-02-13 18:52:11 +00:00
ashmrtn
28aba60cc5
Check for and retry 404s with no content (#5217)
We've started seeing 404 errors with no content being returned. Check for these in the http wrapper we use and retry them.

While the graph SDK returns an error for this sort of situation, it's a very basic error, since the SDK normally expects to parse info out of the response body. Therefore it should be safe to inject our own error that we can check for.
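
A rough sketch of the injected-error approach (names are illustrative; using `resp.ContentLength` as the emptiness test is an assumption):

```go
package sketch

import (
	"errors"
	"net/http"
)

// errEmpty404 is the error we inject for a 404 with no response body,
// since the SDK's own error carries too little to match on.
var errEmpty404 = errors.New("404 with empty response body")

// classify runs in the http wrapper and swaps in our checkable error.
func classify(resp *http.Response) error {
	if resp.StatusCode == http.StatusNotFound && resp.ContentLength == 0 {
		return errEmpty404
	}
	return nil
}

// shouldRetry lets the retry loop match on the injected error.
func shouldRetry(err error) bool {
	return errors.Is(err, errEmpty404)
}
```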

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-13 02:26:04 +00:00
ashmrtn
03048a6ca8
Fix contains check for exchange nightly tests (#5214)
Data being checked drifted due to recent changes in the test helper
and possible CLI output changes

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-12 20:32:25 +00:00
dependabot[bot]
97535e2afc
⬆️ Bump golangci/golangci-lint-action from 3 to 4 (#5213)
Bumps [golangci/golangci-lint-action](https://github.com/golangci/golangci-lint-action) from 3 to 4.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/golangci/golangci-lint-action/releases">golangci/golangci-lint-action's releases</a>.</em></p>
<blockquote>
<h2>v4.0.0</h2>
<!-- raw HTML omitted -->
<h2>What's Changed</h2>
<h3>Documentation</h3>
<ul>
<li>docs: update examples by <a href="https://github.com/KunalSin9h"><code>@​KunalSin9h</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/826">golangci/golangci-lint-action#826</a></li>
<li>docs: update section about GitHub Annotations by <a href="https://github.com/JustinDFuller"><code>@​JustinDFuller</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/931">golangci/golangci-lint-action#931</a></li>
</ul>
<h3>Dependencies</h3>
<ul>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.3.0 to 6.4.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/829">golangci/golangci-lint-action#829</a></li>
<li>build(deps-dev): bump eslint-plugin-import from 2.28.0 to 2.28.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/830">golangci/golangci-lint-action#830</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.5.0 to 20.5.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/827">golangci/golangci-lint-action#827</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.3.0 to 6.4.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/831">golangci/golangci-lint-action#831</a></li>
<li>build(deps-dev): bump prettier from 3.0.1 to 3.0.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/828">golangci/golangci-lint-action#828</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.5.1 to 20.5.7 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/833">golangci/golangci-lint-action#833</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.4.0 to 6.4.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/834">golangci/golangci-lint-action#834</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.4.0 to 6.4.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/835">golangci/golangci-lint-action#835</a></li>
<li>build(deps-dev): bump eslint from 8.47.0 to 8.48.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/837">golangci/golangci-lint-action#837</a></li>
<li>build(deps-dev): bump typescript from 5.1.6 to 5.2.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/836">golangci/golangci-lint-action#836</a></li>
<li>build(deps): bump <code>@​types/semver</code> from 7.5.0 to 7.5.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/838">golangci/golangci-lint-action#838</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.4.1 to 6.5.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/839">golangci/golangci-lint-action#839</a></li>
<li>build(deps-dev): bump prettier from 3.0.2 to 3.0.3 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/842">golangci/golangci-lint-action#842</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.4.1 to 6.5.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/840">golangci/golangci-lint-action#840</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.5.7 to 20.5.9 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/841">golangci/golangci-lint-action#841</a></li>
<li>chore: bump to use node20 runtime, actions/checkout to v4 by <a href="https://github.com/chenrui333"><code>@​chenrui333</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/843">golangci/golangci-lint-action#843</a></li>
<li>build(deps): bump actions/checkout from 3 to 4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/845">golangci/golangci-lint-action#845</a></li>
<li>build(deps-dev): bump eslint from 8.48.0 to 8.49.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/846">golangci/golangci-lint-action#846</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.5.9 to 20.6.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/847">golangci/golangci-lint-action#847</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.5.0 to 6.6.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/848">golangci/golangci-lint-action#848</a></li>
<li>build(deps-dev): bump <code>@​vercel/ncc</code> from 0.36.1 to 0.38.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/850">golangci/golangci-lint-action#850</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.5.0 to 6.6.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/849">golangci/golangci-lint-action#849</a></li>
<li>build(deps): bump <code>@​types/semver</code> from 7.5.1 to 7.5.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/853">golangci/golangci-lint-action#853</a></li>
<li>build(deps): bump <code>@​types/tmp</code> from 0.2.3 to 0.2.4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/854">golangci/golangci-lint-action#854</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.6.0 to 6.7.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/855">golangci/golangci-lint-action#855</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.6.0 to 20.6.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/857">golangci/golangci-lint-action#857</a></li>
<li>build(deps): bump <code>@​actions/core</code> from 1.10.0 to 1.10.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/856">golangci/golangci-lint-action#856</a></li>
<li>build(deps-dev): bump eslint from 8.49.0 to 8.50.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/859">golangci/golangci-lint-action#859</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.6.2 to 20.6.5 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/860">golangci/golangci-lint-action#860</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.6.0 to 6.7.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/861">golangci/golangci-lint-action#861</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.7.0 to 6.7.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/862">golangci/golangci-lint-action#862</a></li>
<li>build(deps): bump <code>@​types/semver</code> from 7.5.2 to 7.5.3 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/864">golangci/golangci-lint-action#864</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.7.2 to 6.7.3 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/865">golangci/golangci-lint-action#865</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.6.5 to 20.8.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/867">golangci/golangci-lint-action#867</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.7.2 to 6.7.3 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/866">golangci/golangci-lint-action#866</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.7.3 to 6.7.4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/868">golangci/golangci-lint-action#868</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.8.0 to 20.8.3 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/869">golangci/golangci-lint-action#869</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.7.3 to 6.7.4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/870">golangci/golangci-lint-action#870</a></li>
<li>build(deps-dev): bump eslint from 8.50.0 to 8.51.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/871">golangci/golangci-lint-action#871</a></li>
<li>build(deps): bump <code>@​actions/http-client</code> from 2.1.1 to 2.2.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/872">golangci/golangci-lint-action#872</a></li>
<li>build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.7.4 to 6.7.5 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/874">golangci/golangci-lint-action#874</a></li>
<li>build(deps): bump <code>@​types/node</code> from 20.8.3 to 20.8.6 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/golangci/golangci-lint-action/pull/875">golangci/golangci-lint-action#875</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="3cfe3a4abb"><code>3cfe3a4</code></a> build(deps): bump <code>@​actions/cache</code> from 3.2.3 to 3.2.4 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/963">#963</a>)</li>
<li><a href="cbc59cf0d1"><code>cbc59cf</code></a> build(deps-dev): bump prettier from 3.2.4 to 3.2.5 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/960">#960</a>)</li>
<li><a href="459a04b021"><code>459a04b</code></a> build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.19.1 to 6.20.0 ...</li>
<li><a href="e2315b67db"><code>e2315b6</code></a> build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.19.1 to 6.20.0 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/961">#961</a>)</li>
<li><a href="d6173a45d0"><code>d6173a4</code></a> build(deps): bump <code>@​types/node</code> from 20.11.10 to 20.11.16 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/962">#962</a>)</li>
<li><a href="0e8f5bf773"><code>0e8f5bf</code></a> build(deps): bump <code>@​types/node</code> from 20.11.5 to 20.11.10 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/958">#958</a>)</li>
<li><a href="349d20632d"><code>349d206</code></a> build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.19.0 to 6.19.1 ...</li>
<li><a href="2221aee284"><code>2221aee</code></a> build(deps-dev): bump <code>@​typescript-eslint/parser</code> from 6.18.1 to 6.19.1 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/954">#954</a>)</li>
<li><a href="3b44ae5b24"><code>3b44ae5</code></a> build(deps-dev): bump <code>@​typescript-eslint/eslint-plugin</code> from 6.18.1 to 6.19.0 ...</li>
<li><a href="323b871bbc"><code>323b871</code></a> build(deps-dev): bump prettier from 3.2.2 to 3.2.4 (<a href="https://redirect.github.com/golangci/golangci-lint-action/issues/950">#950</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/golangci/golangci-lint-action/compare/v3...v4">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=golangci/golangci-lint-action&package-manager=github_actions&previous-version=3&new-version=4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-02-12 19:53:24 +00:00
dependabot[bot]
cd7450395e
⬆️ Bump github.com/minio/minio-go/v7 from 7.0.66 to 7.0.67 in /src (#5210)
Bumps [github.com/minio/minio-go/v7](https://github.com/minio/minio-go) from 7.0.66 to 7.0.67.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/minio/minio-go/releases">github.com/minio/minio-go/v7's releases</a>.</em></p>
<blockquote>
<h2>Bugfix release</h2>
<h2>What's Changed</h2>
<ul>
<li>detect offline for more valid errors by <a href="https://github.com/harshavardhana"><code>@​harshavardhana</code></a> in <a href="https://redirect.github.com/minio/minio-go/pull/1919">minio/minio-go#1919</a></li>
<li>NEW API: GetObjectAttributes by <a href="https://github.com/zveinn"><code>@​zveinn</code></a> in <a href="https://redirect.github.com/minio/minio-go/pull/1921">minio/minio-go#1921</a></li>
<li>fix: support more type to StringSet umnarshaJSON by <a href="https://github.com/jiuker"><code>@​jiuker</code></a> in <a href="https://redirect.github.com/minio/minio-go/pull/1925">minio/minio-go#1925</a></li>
<li>Update api-remove.go by <a href="https://github.com/fwessels"><code>@​fwessels</code></a> in <a href="https://redirect.github.com/minio/minio-go/pull/1926">minio/minio-go#1926</a></li>
<li>Enable --expired-object-all-versions by <a href="https://github.com/shtripat"><code>@​shtripat</code></a> in <a href="https://redirect.github.com/minio/minio-go/pull/1927">minio/minio-go#1927</a></li>
<li>fix: latest linter issues by <a href="https://github.com/harshavardhana"><code>@​harshavardhana</code></a> in <a href="https://redirect.github.com/minio/minio-go/pull/1929">minio/minio-go#1929</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/zveinn"><code>@​zveinn</code></a> made their first contribution in <a href="https://redirect.github.com/minio/minio-go/pull/1921">minio/minio-go#1921</a></li>
<li><a href="https://github.com/jiuker"><code>@​jiuker</code></a> made their first contribution in <a href="https://redirect.github.com/minio/minio-go/pull/1925">minio/minio-go#1925</a></li>
<li><a href="https://github.com/shtripat"><code>@​shtripat</code></a> made their first contribution in <a href="https://redirect.github.com/minio/minio-go/pull/1927">minio/minio-go#1927</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/minio/minio-go/compare/v7.0.66...v7.0.67">https://github.com/minio/minio-go/compare/v7.0.66...v7.0.67</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="99c6311121"><code>99c6311</code></a> fix: latest linter issues (<a href="https://redirect.github.com/minio/minio-go/issues/1929">#1929</a>)</li>
<li><a href="72b90536ab"><code>72b9053</code></a> Enable --expired-object-all-versions (<a href="https://redirect.github.com/minio/minio-go/issues/1927">#1927</a>)</li>
<li><a href="c6d47d8f4b"><code>c6d47d8</code></a> Update api-remove.go (<a href="https://redirect.github.com/minio/minio-go/issues/1926">#1926</a>)</li>
<li><a href="6ad2b4a178"><code>6ad2b4a</code></a> fix: support all types in StringSet JSON unmarshal (<a href="https://redirect.github.com/minio/minio-go/issues/1925">#1925</a>)</li>
<li><a href="76a41461fe"><code>76a4146</code></a> NEW API: GetObjectAttributes (<a href="https://redirect.github.com/minio/minio-go/issues/1921">#1921</a>)</li>
<li><a href="56d9949682"><code>56d9949</code></a> detect offline for more valid errors (<a href="https://redirect.github.com/minio/minio-go/issues/1919">#1919</a>)</li>
<li><a href="f86f90f5f4"><code>f86f90f</code></a> update x/crypto and add CREDITS</li>
<li><a href="594eb81116"><code>594eb81</code></a> Update version to next release</li>
<li>See full diff in <a href="https://github.com/minio/minio-go/compare/v7.0.66...v7.0.67">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/minio/minio-go/v7&package-manager=go_modules&previous-version=7.0.66&new-version=7.0.67)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-02-12 18:16:07 +00:00
Abhishek Pandey
411ef24024
Handle item attachments with missing name (#5209)
<!-- PR description-->

Extending the earlier fix in https://github.com/alcionai/corso/pull/5199 to `itemAttachments`. Posts are not impacted here since they don't have attachment types like messages do.
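
A minimal sketch of a name fallback for such attachments (the fallback format is an assumption, not corso's exact naming):

```go
package sketch

// attachmentName returns a usable filename for an attachment, falling
// back to a generated name when the attachment carries none.
func attachmentName(name, id string) string {
	if name != "" {
		return name
	}
	return "attachment-" + id
}
```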

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-11 22:13:20 +00:00
Vaibhav Kamra
b3b52c0dfc
Do not backup shared calendars (#5207)
Skip backup of shared calendars. These will be backed up with the resource that owns the calendar.

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-11 18:58:09 +00:00
Abin Simon
8502e1fee6
Use recurrence timezone for ics exports (#5206)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-10 08:50:38 +00:00
Abin Simon
f0b8041c3f
Fix possible panic in contacts fetch (#5205)
We were not checking for the error returned by the Get method before trying to use the result to get contact info.
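
The shape of the fix, with illustrative stand-in types:

```go
package sketch

import "context"

type contact struct{ displayName string }

type getter interface {
	Get(ctx context.Context, id string) (*contact, error)
}

// fetchContactName checks the Get error before touching the result;
// the panic came from using the (nil) result when the error went
// unchecked.
func fetchContactName(ctx context.Context, g getter, id string) (string, error) {
	c, err := g.Get(ctx, id)
	if err != nil {
		return "", err
	}
	return c.displayName, nil
}
```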

<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-10 07:57:57 +00:00
Abhishek Pandey
f92f811559
Retry more errors in graph adapter wrapper (#5203)
<!-- PR description-->

* We have started seeing `io.ErrUnexpectedEOF` and `read: connection timed out` errors in the last 2 days for exchange backups. Retry those (see the sketch after this list).
* Also increase the retry count from 3 to 6. This is more of a hail mary to retry `InvalidAuthenticationToken` errors. We have observed that retries do help, but for a small set of requests we end up exhausting retries and eventually failing with the InvalidAuthenticationToken error. Hoping that bumping this to 6 will get us some relief. This fix may be removed if we find the root cause/pattern behind this.
* The event list test was taking > 150 secs. Thought I'd push this change as it's a small fix. I first thought my PR https://github.com/alcionai/corso/pull/5202 broke the test, so I investigated this.
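
A rough sketch of the widened retry predicate (illustrative; corso's actual matching may differ):

```go
package sketch

import (
	"errors"
	"io"
	"strings"
)

// maxRetries was bumped from 3 to 6 as a hail mary for auth-token errors.
const maxRetries = 6

// retryable reports whether a transport error is worth retrying. The
// substring match mirrors the observed "read: connection timed out"
// failures; matching on error text is pragmatic, not an exact predicate.
func retryable(err error) bool {
	if errors.Is(err, io.ErrUnexpectedEOF) {
		return true
	}
	return err != nil && strings.Contains(err.Error(), "connection timed out")
}
```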

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-10 05:11:05 +00:00
ashmrtn
71a9087e4d
Add a fallback to non-delta pager during event item enumeration (#5201)
If the delta pager with a smaller page size also fails, then attempt to
use the regular events endpoint to enumerate items.
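
A minimal sketch of the fallback (types and names are illustrative):

```go
package sketch

import "context"

type pager interface {
	enumerate(ctx context.Context) ([]string, error)
}

// getEventIDs tries the delta pager first (already at the reduced page
// size) and, if that also fails, falls back to the plain events
// endpoint so the backup can still make progress.
func getEventIDs(ctx context.Context, delta, plain pager) ([]string, error) {
	if ids, err := delta.enumerate(ctx); err == nil {
		return ids, nil
	}
	return plain.enumerate(ctx)
}
```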

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-10 00:45:43 +00:00
Keepers
45886e2ad9
allow eml export when attachments have no name (#5199)
#### Does this PR need a docs update or release note?

- [x]  Yes, it's included

#### Type of change

- [x] 🐛 Bugfix

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-02-09 18:41:10 +00:00
ashmrtn
7262d3b284
Remove deleted flag from test command (#5192)
Longevity tests run using the latest release of corso. Since we
recently made a release that contains the removal of this flag, we need
to update the github action.

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [x] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup


#### Test Plan

- [ ] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-08 17:31:30 +00:00
ashmrtn
2ab3c890b4
Retry events item enumeration with smaller page size for some cases (#5194)
Update the events item enumerator to switch to a smaller page size for
the delta pager if we fail enumeration with a 503 error and no content.
We've found that this situation is indicative of the Graph server being
slow, and a smaller page size still allows us to make progress.
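
A minimal sketch of the retry-with-smaller-pages logic, with made-up page sizes (corso's actual sizes and error sentinel may differ):

```go
package sketch

import (
	"context"
	"errors"
)

const (
	defaultPageSize = 500 // illustrative values, not corso's exact sizes
	reducedPageSize = 50
)

// errEmpty503 stands in for the contentless 503 we've seen when the
// Graph server is slow.
var errEmpty503 = errors.New("503 with empty response body")

// enumerate retries a failed delta enumeration once with a smaller
// page size, but only for the contentless-503 failure mode.
func enumerate(ctx context.Context, run func(ctx context.Context, pageSize int) error) error {
	if err := run(ctx, defaultPageSize); !errors.Is(err, errEmpty503) {
		return err
	}
	return run(ctx, reducedPageSize)
}
```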

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test (in another PR)
- [ ] 💚 E2E
2024-02-08 07:43:20 +00:00
Keepers
f00dd0f88a
introduce intg and unit test setup packages (#5140)
corso has a thousand bespoke approaches to setting the same info in all of its tests.  This is a first step towards minimizing and standardizing the lift around that work. Future PRs will distribute these packages through the repo.

---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🤖 Supportability/Tests
- [x] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-02-07 18:01:35 +00:00
Abin Simon
a2d40b4d38
Handle itemAttachments for emails (#5181)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* https://github.com/alcionai/corso/issues/5040

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-07 04:50:30 +00:00
Abhishek Pandey
c1ec1585a2
Skip exchange items which fail to download due to ErrorCorruptData error (#5191)
<!-- PR description-->

* We are seeing 500 errors from graph during exchange (email) backup.
* The error is `{"code":"ErrorCorruptData","message":"Data is corrupt., Invalid global object ID: some ID"}`.
* Catch the error and add a skip, since these items cannot be downloaded from graph (see the sketch below).
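
A rough sketch of the skip path; this matches on the error text, whereas the real check likely inspects the OData error code:

```go
package sketch

import "strings"

// isCorruptData reports whether a graph failure is the ErrorCorruptData
// 500 case.
func isCorruptData(err error) bool {
	return err != nil && strings.Contains(err.Error(), "ErrorCorruptData")
}

// skip records an item that can never be downloaded, instead of
// failing the whole backup.
type skip struct{ itemID, cause string }

func maybeSkip(itemID string, err error) (*skip, error) {
	if isCorruptData(err) {
		return &skip{itemID: itemID, cause: "ErrorCorruptData"}, nil
	}
	return nil, err
}
```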

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-07 02:40:14 +00:00
Keepers
a680f13f84
add by-resp-code catcher to err transformer (#5184)
Start catching graph responses by broad response code, in addition to the response body details.
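
A minimal sketch of code-based classification (the categories are illustrative, not corso's sentinel errors):

```go
package sketch

import "net/http"

// classifyByCode maps broad response codes to error categories,
// complementing the existing matching on response-body details.
func classifyByCode(status int) string {
	switch status {
	case http.StatusTooManyRequests:
		return "throttled"
	case http.StatusUnauthorized:
		return "invalid auth token"
	case http.StatusServiceUnavailable:
		return "service unavailable"
	default:
		return ""
	}
}
```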

---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🌻 Feature

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-02-06 18:02:34 +00:00
Abin Simon
9c8ac96aed
Skip emails with incorrect email when exporting ics file (#5190)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-06 17:03:58 +00:00
Hitesh Pattanayak
e6dd387811
Add v0.19.0 changelog (#5189)
marks unreleased changes for v0.19.0

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [x] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-06 13:36:27 +00:00
Hitesh Pattanayak
fb64a2f52b
adds sharepoint lists changelog (#5188)
adds lists support to the changelog

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [x] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-06 09:07:45 +00:00
Abhishek Pandey
0cde1a4778
Update docs for group mailbox release (#5187)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [x] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-02-06 08:34:52 +00:00
Keepers
e86592f51e
replace graph Wrap and Stack with clues (#5018)
Now that graph errors are always transformed as part of the graph client wrapper or http_wrapper, we don't need to call graph.Stack or graph.Wrap outside of those inner helpers any longer. This PR swaps all those graph calls with equivalent clues funcs as a cleanup.

Should not contain any logical changes.

---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🧹 Tech Debt/Cleanup

#### Issue(s)

* #4685 

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-02-06 01:08:26 +00:00
Abhishek Pandey
53a0525bfd
Handle previous paths for group mailbox (#5154)
<!-- PR description-->

* Tombstone collections weren't being added because we were not processing prev paths for the group mailbox category.
* Hence, deleted conversations were being carried forward even if `--disable-incrementals` was used.

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-05 23:10:09 +00:00
Abhishek Pandey
de22131b23
Split tombstone handling for groups services (#5170)
<!-- PR description-->

* Prerequisite to the currently failing PR https://github.com/alcionai/corso/pull/5154
* Conversation tombstone IDs are currently 2-part (`convID/threadID`).
* However, tombstones are looked up by the `convID` part only, see [code](9a603d1f21/src/internal/m365/collection/groups/backup.go (L128)).
* Adding a change so that tombstone IDs are reduced to `convID` (see the sketch after this list). Otherwise, we'd never delete tombstones while processing collections, and we'd run into `conflict: tombstone exists for a live collection` errors.
* This is safe to do as there is always a 1:1 relationship between `convID` and `threadID`. For example, attempting to create another thread inside a conversation creates a new conversation.
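
A minimal sketch of the ID reduction:

```go
package sketch

import "strings"

// tombstoneKey reduces a two-part conversation tombstone ID
// ("convID/threadID") to the convID that lookups use; safe because a
// conversation maps 1:1 to a thread.
func tombstoneKey(id string) string {
	convID, _, _ := strings.Cut(id, "/")
	return convID
}
```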

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-02-05 22:24:09 +00:00
Keepers
4cf4c22259
skip chats e2e tests (#5183)
#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🐛 Bugfix
- [x] 🤖 Supportability/Tests
2024-02-05 21:44:37 +00:00
dependabot[bot]
b5ac65c3d0
⬆️ Bump peter-evans/slash-command-dispatch from 3 to 4 (#5178)
Bumps [peter-evans/slash-command-dispatch](https://github.com/peter-evans/slash-command-dispatch) from 3 to 4.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/peter-evans/slash-command-dispatch/releases">peter-evans/slash-command-dispatch's releases</a>.</em></p>
<blockquote>
<h2>Slash Command Dispatch v4.0.0</h2>
<p>⚙️  Updated runtime to Node.js 20</p>
<ul>
<li>The action now requires a minimum version of <a href="https://github.com/actions/runner/releases/tag/v2.308.0">v2.308.0</a> for the Actions runner. Update self-hosted runners to v2.308.0 or later to ensure compatibility.</li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>build(deps-dev): bump <code>@​types/node</code> from 16.18.67 to 16.18.68 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/302">peter-evans/slash-command-dispatch#302</a></li>
<li>build(deps-dev): bump prettier from 3.1.0 to 3.1.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/303">peter-evans/slash-command-dispatch#303</a></li>
<li>build(deps-dev): bump jest-circus from 27.4.2 to 27.5.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/176">peter-evans/slash-command-dispatch#176</a></li>
<li>build(deps): bump actions/download-artifact from 3 to 4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/304">peter-evans/slash-command-dispatch#304</a></li>
<li>build(deps): bump actions/upload-artifact from 3 to 4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/305">peter-evans/slash-command-dispatch#305</a></li>
<li>build(deps-dev): bump eslint from 8.55.0 to 8.56.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/306">peter-evans/slash-command-dispatch#306</a></li>
<li>build(deps-dev): bump eslint-plugin-prettier from 5.0.1 to 5.1.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/308">peter-evans/slash-command-dispatch#308</a></li>
<li>build(deps-dev): bump <code>@​types/node</code> from 16.18.68 to 16.18.69 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/309">peter-evans/slash-command-dispatch#309</a></li>
<li>build(deps-dev): bump <code>@​types/node</code> from 16.18.69 to 16.18.70 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/310">peter-evans/slash-command-dispatch#310</a></li>
<li>build(deps-dev): bump eslint-plugin-prettier from 5.1.2 to 5.1.3 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/311">peter-evans/slash-command-dispatch#311</a></li>
<li>build(deps-dev): bump prettier from 3.1.1 to 3.2.2 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/312">peter-evans/slash-command-dispatch#312</a></li>
<li>build(deps-dev): bump prettier from 3.2.2 to 3.2.4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/314">peter-evans/slash-command-dispatch#314</a></li>
<li>build(deps-dev): bump <code>@​types/node</code> from 16.18.70 to 16.18.74 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/315">peter-evans/slash-command-dispatch#315</a></li>
<li>build(deps): bump peter-evans/create-or-update-comment from 3 to 4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/317">peter-evans/slash-command-dispatch#317</a></li>
<li>build(deps-dev): bump <code>@​types/node</code> from 16.18.74 to 16.18.76 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/318">peter-evans/slash-command-dispatch#318</a></li>
<li>feat: update runtime to node 20 by <a href="https://github.com/peter-evans"><code>@​peter-evans</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/316">peter-evans/slash-command-dispatch#316</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/peter-evans/slash-command-dispatch/compare/v3.0.2...v4.0.0">https://github.com/peter-evans/slash-command-dispatch/compare/v3.0.2...v4.0.0</a></p>
<h2>Slash Command Dispatch v3.0.2</h2>
<h2>What's Changed</h2>
<ul>
<li>fix: replace use of any type by <a href="https://github.com/peter-evans"><code>@​peter-evans</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/152">peter-evans/slash-command-dispatch#152</a></li>
<li>fixing link to <code>Get a pull request</code> by <a href="https://github.com/Borda"><code>@​Borda</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/270">peter-evans/slash-command-dispatch#270</a></li>
<li>Update getting-started.md due to changes in Github UI by <a href="https://github.com/martin-displayr"><code>@​martin-displayr</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/289">peter-evans/slash-command-dispatch#289</a></li>
<li>[Security] Fix GraphQL query to get a collaborator's permission by <a href="https://github.com/0xn3va"><code>@​0xn3va</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/301">peter-evans/slash-command-dispatch#301</a></li>
<li>129 dependency updates by <a href="https://github.com/dependabot"><code>@​dependabot</code></a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/Borda"><code>@​Borda</code></a> made their first contribution in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/270">peter-evans/slash-command-dispatch#270</a></li>
<li><a href="https://github.com/martin-displayr"><code>@​martin-displayr</code></a> made their first contribution in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/289">peter-evans/slash-command-dispatch#289</a></li>
<li><a href="https://github.com/0xn3va"><code>@​0xn3va</code></a> made their first contribution in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/301">peter-evans/slash-command-dispatch#301</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/peter-evans/slash-command-dispatch/compare/v3.0.1...v3.0.2">https://github.com/peter-evans/slash-command-dispatch/compare/v3.0.1...v3.0.2</a></p>
<h2>Slash Command Dispatch v3.0.1</h2>
<p>⚙️ Bumps <code>@actions/core</code> to transition away from <a href="https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/">deprecated runner commands</a>.</p>
<h2>What's Changed</h2>
<ul>
<li>build(deps): bump peter-evans/create-pull-request from 3 to 4 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/142">peter-evans/slash-command-dispatch#142</a></li>
<li>build(deps): bump <code>@​actions/core</code> from 1.6.0 to 1.9.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/144">peter-evans/slash-command-dispatch#144</a></li>
<li>Update distribution by <a href="https://github.com/github-actions"><code>@​github-actions</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/145">peter-evans/slash-command-dispatch#145</a></li>
<li>build(deps): bump <code>@​actions/core</code> from 1.9.1 to 1.10.0 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/149">peter-evans/slash-command-dispatch#149</a></li>
<li>Update distribution by <a href="https://github.com/github-actions"><code>@​github-actions</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/150">peter-evans/slash-command-dispatch#150</a></li>
<li>build(deps): bump <code>@​actions/github</code> from 5.0.0 to 5.1.1 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/peter-evans/slash-command-dispatch/pull/148">peter-evans/slash-command-dispatch#148</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="13bc09769d"><code>13bc097</code></a> feat: update runtime to node 20 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/316">#316</a>)</li>
<li><a href="128d8b9272"><code>128d8b9</code></a> build(deps-dev): bump <code>@​types/node</code> from 16.18.74 to 16.18.76 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/318">#318</a>)</li>
<li><a href="df481df93d"><code>df481df</code></a> build(deps): bump peter-evans/create-or-update-comment from 3 to 4 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/317">#317</a>)</li>
<li><a href="d4579a0d22"><code>d4579a0</code></a> build(deps-dev): bump <code>@​types/node</code> from 16.18.70 to 16.18.74 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/315">#315</a>)</li>
<li><a href="8f053eaa2e"><code>8f053ea</code></a> build(deps-dev): bump prettier from 3.2.2 to 3.2.4 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/314">#314</a>)</li>
<li><a href="b3eb783760"><code>b3eb783</code></a> build(deps-dev): bump prettier from 3.1.1 to 3.2.2 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/312">#312</a>)</li>
<li><a href="c0334d0fed"><code>c0334d0</code></a> build(deps-dev): bump eslint-plugin-prettier from 5.1.2 to 5.1.3 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/311">#311</a>)</li>
<li><a href="e627c61300"><code>e627c61</code></a> build(deps-dev): bump <code>@​types/node</code> from 16.18.69 to 16.18.70 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/310">#310</a>)</li>
<li><a href="5c23a33777"><code>5c23a33</code></a> build(deps-dev): bump <code>@​types/node</code> from 16.18.68 to 16.18.69 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/309">#309</a>)</li>
<li><a href="8dd62d5c45"><code>8dd62d5</code></a> build(deps-dev): bump eslint-plugin-prettier from 5.0.1 to 5.1.2 (<a href="https://redirect.github.com/peter-evans/slash-command-dispatch/issues/308">#308</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/peter-evans/slash-command-dispatch/compare/v3...v4">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=peter-evans/slash-command-dispatch&package-manager=github_actions&previous-version=3&new-version=4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-02-05 21:05:21 +00:00
dependabot[bot]
b4b8088a97
⬆️ Bump mermaid from 10.7.0 to 10.8.0 in /website (#5179)
Bumps [mermaid](https://github.com/mermaid-js/mermaid) from 10.7.0 to 10.8.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/mermaid-js/mermaid/releases">mermaid's releases</a>.</em></p>
<blockquote>
<h1>v10.8.0</h1>
<h2>Features</h2>
<!-- raw HTML omitted -->
<ul>
<li>
<p>Adding new diagram type - Block Diagram by <a href="https://github.com/knsv"><code>@​knsv</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5221">mermaid-js/mermaid#5221</a></p>
</li>
<li>
<p>Feature/5114 add parallel commit config by <a href="https://github.com/mathbraga"><code>@​mathbraga</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5161">mermaid-js/mermaid#5161</a></p>
</li>
<li>
<p>Changes to Gantt Parsers to allow hashes and semicolons to titles, sections, and task data. by <a href="https://github.com/FutzMonitor"><code>@​FutzMonitor</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5095">mermaid-js/mermaid#5095</a></p>
</li>
<li>
<p>Feature/4653 add actor-top class to sequence diagram by <a href="https://github.com/Ronid1"><code>@​Ronid1</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5241">mermaid-js/mermaid#5241</a></p>
</li>
</ul>
<h2>Documentation</h2>
<ul>
<li>Updated gantt chart docs to show all config options by <a href="https://github.com/murdoa"><code>@​murdoa</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5192">mermaid-js/mermaid#5192</a></li>
<li>Contribution documentation improvements by <a href="https://github.com/nirname"><code>@​nirname</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5132">mermaid-js/mermaid#5132</a></li>
<li>Update flowchart.md - how to use font-awesome <a href="https://redirect.github.com/mermaid-js/mermaid/issues/5195">#5195</a> by <a href="https://github.com/arukiidou"><code>@​arukiidou</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5196">mermaid-js/mermaid#5196</a></li>
<li>Add more detailed docs for Gantt tasks by <a href="https://github.com/sorenisanerd"><code>@​sorenisanerd</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5194">mermaid-js/mermaid#5194</a></li>
<li>Docs/4974 reorder integration links by <a href="https://github.com/Ronid1"><code>@​Ronid1</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5066">mermaid-js/mermaid#5066</a></li>
<li>docs: fix swimm link by <a href="https://github.com/Yokozuna59"><code>@​Yokozuna59</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5219">mermaid-js/mermaid#5219</a></li>
<li>Update Slack community links to Discord by <a href="https://github.com/Olegt0rr"><code>@​Olegt0rr</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5225">mermaid-js/mermaid#5225</a></li>
<li>Docs: Mermaid chart updates by <a href="https://github.com/huynhicode"><code>@​huynhicode</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5232">mermaid-js/mermaid#5232</a></li>
<li>Fix typos in timeline syntax samples by <a href="https://github.com/sblom"><code>@​sblom</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5139">mermaid-js/mermaid#5139</a></li>
</ul>
<h2>Bug fixes</h2>
<ul>
<li>Bug/5059 fix external connection after updating edges by <a href="https://github.com/mathbraga"><code>@​mathbraga</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5127">mermaid-js/mermaid#5127</a></li>
<li>[Fix] Sequence diagram actor menu popup by <a href="https://github.com/vitorsss"><code>@​vitorsss</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5160">mermaid-js/mermaid#5160</a></li>
<li>fix: Dompurify Hooks by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5236">mermaid-js/mermaid#5236</a></li>
<li>Accurate pie chart labeling for text alignment by <a href="https://github.com/JenningsWilliam"><code>@​JenningsWilliam</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5141">mermaid-js/mermaid#5141</a></li>
<li>fix: Redirect of old URLs by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5250">mermaid-js/mermaid#5250</a></li>
<li>Fixed Typo in ErrorRenderer.ts by <a href="https://github.com/FutzMonitor"><code>@​FutzMonitor</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5256">mermaid-js/mermaid#5256</a></li>
</ul>
<h2>Chores</h2>
<ul>
<li>Revert &quot;Revert 5041 feature/4935 subgraph title margin config option&quot; by <a href="https://github.com/mathbraga"><code>@​mathbraga</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5205">mermaid-js/mermaid#5205</a></li>
<li>build(deps-dev): bump follow-redirects from 1.15.2 to 1.15.5 by <a href="https://github.com/dependabot"><code>@​dependabot</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5200">mermaid-js/mermaid#5200</a></li>
<li>chore(deps): update all patch dependencies (patch) by <a href="https://github.com/renovate"><code>@​renovate</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5150">mermaid-js/mermaid#5150</a></li>
<li>E2E Image comparison by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5208">mermaid-js/mermaid#5208</a></li>
<li>E2E test by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5210">mermaid-js/mermaid#5210</a></li>
<li>Optimise caching of test results by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5213">mermaid-js/mermaid#5213</a></li>
<li>Update update-browserlist.yml to fix deprecation and action fails by <a href="https://github.com/Abrifq"><code>@​Abrifq</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5151">mermaid-js/mermaid#5151</a></li>
<li>UpdateCypress by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5228">mermaid-js/mermaid#5228</a></li>
<li>Use node v20 by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5248">mermaid-js/mermaid#5248</a></li>
<li>Convert Mindmap to TS by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5247">mermaid-js/mermaid#5247</a></li>
<li>chore: Add interface naming Convention by <a href="https://github.com/sidharthv96"><code>@​sidharthv96</code></a> in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5254">mermaid-js/mermaid#5254</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/murdoa"><code>@​murdoa</code></a> made their first contribution in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5192">mermaid-js/mermaid#5192</a></li>
<li><a href="https://github.com/arukiidou"><code>@​arukiidou</code></a> made their first contribution in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5196">mermaid-js/mermaid#5196</a></li>
<li><a href="https://github.com/sorenisanerd"><code>@​sorenisanerd</code></a> made their first contribution in <a href="https://redirect.github.com/mermaid-js/mermaid/pull/5194">mermaid-js/mermaid#5194</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="51e7444b94"><code>51e7444</code></a> <a href="https://redirect.github.com/mermaid-js/mermaid/issues/3358">#3358</a> Layoutfix for growing parent when children spans new rows due to update...</li>
<li><a href="2fa9219353"><code>2fa9219</code></a> Update docs</li>
<li><a href="f5555245f9"><code>f555524</code></a> Mermaid version 10.8.0</li>
<li><a href="5c9857c4eb"><code>5c9857c</code></a> Merge pull request <a href="https://redirect.github.com/mermaid-js/mermaid/issues/5247">#5247</a> from mermaid-js/sidv/mindmapToTs</li>
<li><a href="494ba45c5e"><code>494ba45</code></a> Fixed Typo in ErrorRenderer.ts (<a href="https://redirect.github.com/mermaid-js/mermaid/issues/5256">#5256</a>)</li>
<li><a href="b38def6866"><code>b38def6</code></a> Merge pull request <a href="https://redirect.github.com/mermaid-js/mermaid/issues/5221">#5221</a> from mermaid-js/3358-blocks-diagram</li>
<li><a href="a7afc11079"><code>a7afc11</code></a> <a href="https://redirect.github.com/mermaid-js/mermaid/issues/3358">#3358</a> Removing redundant file</li>
<li><a href="d3c5b02008"><code>d3c5b02</code></a> Merge branch '3358-blocks-diagram' of github.com:mermaid-js/mermaid into 3358...</li>
<li><a href="16149abcc0"><code>16149ab</code></a> <a href="https://redirect.github.com/mermaid-js/mermaid/issues/3358">#3358</a> Fix after review</li>
<li><a href="13d0b61757"><code>13d0b61</code></a> Merge branch 'develop' into 3358-blocks-diagram</li>
<li>Additional commits viewable in <a href="https://github.com/mermaid-js/mermaid/compare/v10.7.0...v10.8.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mermaid&package-manager=npm_and_yarn&previous-version=10.7.0&new-version=10.8.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-02-05 05:33:58 +00:00
Hitesh Pattanayak
29f6582bc7
fixes failing nightly tests due to PR#5060 changes (#5176)
Fixes tests failing due to change in PR #5060 

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
- https://github.com/alcionai/corso/actions/runs/7736081427
- https://github.com/alcionai/corso/actions/runs/7751227007

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-02-02 07:57:53 +00:00
dependabot[bot]
d983488154
⬆️ Bump docusaurus-plugin-image-zoom from 1.0.1 to 2.0.0 in /website (#5174)
Bumps [docusaurus-plugin-image-zoom](https://github.com/gabrielcsapo/docusaurus-plugin-image-zoom) from 1.0.1 to 2.0.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/gabrielcsapo/docusaurus-plugin-image-zoom/releases">docusaurus-plugin-image-zoom's releases</a>.</em></p>
<blockquote>
<h2>Release 2.0.0</h2>
<h4>💥 Breaking Change</h4>
<ul>
<li><a href="https://redirect.github.com/gabrielcsapo/docusaurus-plugin-image-zoom/pull/30">#30</a> Upgrading to Docusaurus v3 (<a href="https://github.com/scalvert"><code>@​scalvert</code></a>)</li>
</ul>
<h4>📝 Documentation</h4>
<ul>
<li><a href="https://redirect.github.com/gabrielcsapo/docusaurus-plugin-image-zoom/pull/26">#26</a> Document update: Usage section (<a href="https://github.com/chellman"><code>@​chellman</code></a>)</li>
</ul>
<h4>Committers: 2</h4>
<ul>
<li>Steve Calvert (<a href="https://github.com/scalvert"><code>@​scalvert</code></a>)</li>
<li><a href="https://github.com/chellman"><code>@​chellman</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/gabrielcsapo/docusaurus-plugin-image-zoom/blob/main/CHANGELOG.md">docusaurus-plugin-image-zoom's changelog</a>.</em></p>
<blockquote>
<h2>v2.0.0 (2024-02-01)</h2>
<h4>💥 Breaking Change</h4>
<ul>
<li><a href="https://redirect.github.com/gabrielcsapo/docusaurus-plugin-image-zoom/pull/30">#30</a> Upgrading to Docusaurus v3 (<a href="https://github.com/scalvert"><code>@​scalvert</code></a>)</li>
</ul>
<h4>📝 Documentation</h4>
<ul>
<li><a href="https://redirect.github.com/gabrielcsapo/docusaurus-plugin-image-zoom/pull/26">#26</a> Document update: Usage section (<a href="https://github.com/chellman"><code>@​chellman</code></a>)</li>
</ul>
<h4>Committers: 2</h4>
<ul>
<li>Steve Calvert (<a href="https://github.com/scalvert"><code>@​scalvert</code></a>)</li>
<li><a href="https://github.com/chellman"><code>@​chellman</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="0e9c504a60"><code>0e9c504</code></a> Release 2.0.0</li>
<li><a href="78e132cf1a"><code>78e132c</code></a> Merge pull request <a href="https://redirect.github.com/gabrielcsapo/docusaurus-plugin-image-zoom/issues/26">#26</a> from chellman/docs-update-2023-06-01</li>
<li><a href="2935b60e12"><code>2935b60</code></a> Merge pull request <a href="https://redirect.github.com/gabrielcsapo/docusaurus-plugin-image-zoom/issues/30">#30</a> from gabrielcsapo/docusaurus-v3</li>
<li><a href="724625bae9"><code>724625b</code></a> Upgrading to Docusaurus v3</li>
<li><a href="aaa5d81758"><code>aaa5d81</code></a> Updated usage section in readme to reflect current usage, and the need for co...</li>
<li>See full diff in <a href="https://github.com/gabrielcsapo/docusaurus-plugin-image-zoom/compare/v1.0.1...v2.0.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=docusaurus-plugin-image-zoom&package-manager=npm_and_yarn&previous-version=1.0.1&new-version=2.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-02-02 05:48:01 +00:00
Abhishek Pandey
7e2b9dab62
Add group mailbox export (#5153)
<!-- PR description-->

* Add EML exports for group mailbox.
* Tested E2E manually along with unit tests added in this PR.
* Will follow it up with a sanity test PR.
---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-31 05:14:59 +00:00
Abhishek Pandey
1537db59c4
Convert serialized posts to eml (#5152)
<!-- PR description-->

* Add the serialized `Postable` to EML converter code for group mailboxes and associated tests.
* We should eventually converge this with the `Messageable` code via a small translator function that converts `Postable` to `Messageable`, since posts are a subset of messages (a rough sketch follows below). It feels a bit premature right now, though; once we've exercised this in prod a bit and know for a fact that posts are indeed a true subset of messages, we can do the convergence.

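A rough sketch of that translator idea, using trimmed stand-in types rather than the real graph SDK `Postable`/`Messageable` interfaces (the field names here are assumptions for illustration):

```go
package convert

// Trimmed stand-ins for the SDK's post/message shapes (assumptions, not the
// real graph SDK interfaces).
type Post struct {
	Subject, From, Body string
}

type Message struct {
	Subject, From, Body string
	To                  []string
}

// postToMessage maps post fields onto a message so the existing message->EML
// converter could, in principle, be reused for group mailbox posts.
func postToMessage(p Post) Message {
	return Message{
		Subject: p.Subject,
		From:    p.From,
		Body:    p.Body,
		// Posts have no direct "To" equivalent; the caller would fill in the
		// group's address.
	}
}
```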
---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-30 21:00:50 +00:00
Keepers
80d7d5c63d
boilerplate teamschat collection package (#5087)
Seems like a lot of code, but this is 95% boilerplate copied from the groups collections packages, with a find-and-replace for names.

Some noteworthy differences:
* teamsChats does not handle metadata, so all metadata, delta, and previous path handling was removed
* teamsChats does not produce tombstones
* chats are never deleted, so no "removed" items are tracked
* all chats get stored at the prefix root, so no "containers" are iterated, and therefore only one collection is ever produced (see the sketch below).

This means that, overall, while the boilerplate follows the same shape, there's much less of it than in similar packages.

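A minimal sketch of that single-collection consequence, using hypothetical local types rather than corso's real collection interfaces:

```go
package teamschats

type Item struct{ ID string }

type Collection struct {
	FullPath string // all chats live at the prefix root
	Items    []Item
}

// collections returns exactly one collection: chats aren't organized into
// containers, are never deleted, and carry no delta or tombstone metadata,
// so there's nothing to fan out over.
func collections(prefixRoot string, chats []Item) []Collection {
	return []Collection{{FullPath: prefixRoot, Items: chats}}
}
```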
---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🌻 Feature

#### Issue(s)

* #5062

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-01-30 20:04:49 +00:00
ashmrtn
ca3ca60ba4
Update repo init for partial init case (#5060)
Unclear if this is precisely the path we want to take to handle possible
partial repo initialization, so feel free to leave suggestions if a
different behavior would be better.

Update the repo init logic to continue if the repo is marked as already existing. If it already exists, the process will try to update the persistent config info in the repo. If the repo already exists but some connection credential is incorrect, stack the errors such that both the reason for the connect failure and the fact that the repo already exists are visible (a standard-library sketch of the stacking follows).

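A standard-library sketch of the error stacking described above (the function names are hypothetical; corso's actual code path uses its own error package):

```go
package repo

import (
	"errors"
	"fmt"
)

var ErrAlreadyInitialized = errors.New("repository already initialized")

// initRepo attempts a fresh init; when the repo already exists it falls
// through to connect-and-update, stacking any connect failure on top of the
// already-exists condition so both stay visible via errors.Is / Unwrap.
func initRepo(initFn, connectFn func() error) error {
	err := initFn()
	if err == nil || !errors.Is(err, ErrAlreadyInitialized) {
		return err
	}

	// Repo exists: try to connect and refresh persistent config instead.
	if cerr := connectFn(); cerr != nil {
		return fmt.Errorf(
			"a repository was already initialized with that configuration: %w: %w",
			cerr, ErrAlreadyInitialized)
	}

	return nil
}
```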
Manually tested attempting to initialize a repo that was already
initialized but using a different passphrase and got the output:
```text
Error: initializing repository: a repository was already initialized with that configuration: connecting to kopia repo: unable to create format manager: invalid repository password: repo already exists: repository already initialized
exit status 1
```

Attempting to initialize a repo again using the same passphrase resulted
in a success message with no errors reported.

Unfortunately I can't really add unit tests to ensure the config ends up
the way we expect, because I have no way to trigger an error partway through
kopia's init logic (or even just after it) without some non-trivial code
refactoring.

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-30 18:27:35 +00:00
Abhishek Pandey
50ba30539a
Persist metadata files for group mailbox posts (#5135)
<!-- PR description-->

* Each post now has a `.data` and `.meta` file.
* I've only made changes to the groups lazy reader to keep this PR short. Note that channels don't have meta/data file concepts, so we don't want to accidentally enable this for them. Given that channels don't use the lazy reader, this is safe to do short term.
* I'll be adding small follow-up PRs to 1) make sure channel files don't get assigned the `.data` suffix, and 2) add support for data and meta files for the prefetch conversations backup. A naming sketch follows this list.

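A hypothetical illustration of the two-file naming for posts; the suffix constants and helper are assumptions for illustration, not necessarily corso's actual values:

```go
package groups

const (
	dataFileSuffix = ".data"
	metaFileSuffix = ".meta"
)

// postItemIDs returns the pair of item IDs stored for one post: one for the
// serialized content, one for its metadata. Channel messages must not go
// through this path, since channels have no meta/data file concept.
func postItemIDs(postID string) (dataID, metaID string) {
	return postID + dataFileSuffix, postID + metaFileSuffix
}
```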
---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-30 11:10:07 +00:00
Hitesh Pattanayak
f1406a3334
adds documentation for sharepoint lists (#5099)
adds documentation for sharepoint lists.
should go in after: https://github.com/alcionai/corso/pull/5048

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [x] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
#4754 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-01-30 16:05:09 +05:30
Hitesh Pattanayak
f615198c78
makes backup of lists explicit and not by default (#5165)
makes backup of lists explicit instead of on by default.

**Backup whole site**:
```
corso backup create sharepoint --site https://10rqc2.sharepoint.com/sites/CorsoCI

Completed Backups:
  ID                                    Started at            Duration       Status     Protected resource                           Data     
  4a324fd8-dcfc-4833-961f-316d11da34e9  2024-01-30T05:45:24Z  21.530965673s  Completed  https://10rqc2.sharepoint.com/sites/CorsoCI  Libraries
  

corso backup details sharepoint --backup 4a324fd8-dcfc-4833-961f-316d11da34e9

ID            ItemName                                     Library         ParentPath                                                                                                            Size    Owner                            Created               Modified            
  15204b287c5c  wat.jpg                                      More Documents  test                                                                                                                  6.1 kB  RFinders@10rqc2.onmicrosoft.com  2023-03-24T20:33:46Z  2023-03-24T20:33:46Z
  69f63c25d3f1  test-file.txt                                Documents       Corso_Test_od_restore_and_backup_multi_30-Jan-2024_05-06-04.380107/b                                                  387 B                                    2024-01-30T05:06:09Z  2024-01-30T05:06:09Z
  c2d55ecb218f  test-file.txt                                Documents       Corso_Test_od_restore_and_backup_multi_30-Jan-2024_05-06-04.380107/folder-a                                           131 B                                    2024-01-30T05:06:11Z  2024-01-30T05:06:12Z
  5d38274759fd  test-file.txt                                Documents       Corso_Test_od_restore_and_backup_multi_30-Jan-2024_05-06-04.380107                                                    87 B                                     2024-01-30T05:06:06Z  2024-01-30T05:06:07Z
  b5c075c00737  test-file.txt                                Documents       Corso_Test_od_restore_and_backup_multi_30-Jan-2024_05-06-04.380107/folder-a/b/folder-a                                387 B                                    2024-01-30T05:06:16Z  2024-01-30T05:06:17Z
 ...
 ...
```

**Backup lists explicitly**:
```
corso backup create sharepoint --site https://10rqc2.sharepoint.com/sites/CorsoCI --data lists
 
 Completed Backups:
  ID                                    Started at            Duration       Status     Protected resource                           Data 
  cf480086-2094-4184-82be-26e2c5c64288  2024-01-30T05:50:35Z  23.605968852s  Completed  https://10rqc2.sharepoint.com/sites/CorsoCI  Lists
 

corso backup details sharepoint --backup cf480086-2094-4184-82be-26e2c5c64288

ID            List                                                                                                                  Items  Created               Modified            
  85d453fd0f13  Corso_Test_Sanity_2024-01-30_05-31-05_e6be57ae-4236-44aa-b506-a5801377f937                                            20     2024-01-30T05:31:15Z  2024-01-30T05:31:18Z
  f8919f072c7c  Corso_Test_Sanity_Restore_20240130_053009_Corso_Test_Sanity_2024-01-30_05-29-33_4fed1e50-f045-4ebb-8a61-8fb608ac9673  20     2024-01-30T05:30:26Z  2024-01-30T05:30:29Z
  821df8ab207d  Corso_Test_Sanity_Restore_20240130_053147_Corso_Test_Sanity_2024-01-30_05-29-33_b44ce328-106c-4880-8e37-e0bbb510ed4d  20     2024-01-30T05:31:58Z  2024-01-30T05:32:02Z
  721f0aadf07f  new-list-name                                                                                                         1      2024-01-19T17:44:26Z  2024-01-29T04:05:42Z
...
...

```

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [x] 🧹 Tech Debt/Cleanup

#### Issue(s)
#4754 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-01-30 08:11:42 +00:00
Hitesh Pattanayak
820d6aba33
enables lists restore (#5158)
enables lists restore

should go in after https://github.com/alcionai/corso/pull/5121

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [x] 🕐 Yes, but in a later PR
- [ ]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [x] 🧹 Tech Debt/Cleanup

#### Issue(s)
#4754 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-01-30 07:32:17 +00:00
Hitesh Pattanayak
576c9f6b53
handles single & multiple values for metadata columns (#5121)
handles single & multiple values for metadata columns

Similar to `Hyperlink` and `Column` columns, the `Metadata` column is also unrecognizable from the Graph API response, hence we identify it from the field column names (see the sketch below).

`Metadata` fields are like tags. A `Metadata` field can be configured to hold multiple values/tags.

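A hedged sketch of the name-based identification; the `_0` suffix heuristic is an assumption for illustration, not necessarily the exact rule used here:

```go
package lists

import "strings"

// isMetadataField guesses whether a fields-map key belongs to a managed
// metadata (taxonomy) column based on its generated name, e.g. "Department_0".
// The suffix heuristic is an assumption for illustration.
func isMetadataField(fieldName string) bool {
	return strings.HasSuffix(fieldName, "_0")
}
```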
**Original List with `Metadata` column (Department) with single value/tag**:
![Metadata-List](https://github.com/alcionai/corso/assets/48874082/0b913a2a-46d5-4d9c-83f9-69a5236b1024)

**Restored List with `Metadata` column with single value/tag**:
![Restored-Metadata-List](https://github.com/alcionai/corso/assets/48874082/9420012b-345c-4fac-90c3-c0d421b2edfb)

**Original List with `Metadata` column (Department) with multiple value/tag**:
![Metadata-List-Multi](https://github.com/alcionai/corso/assets/48874082/054ef4a1-c46e-48ba-b410-a95b540cde33)

**Restored List with `Metadata` column with multiple value/tag**:
![Restored-Multi-Metadata-List](https://github.com/alcionai/corso/assets/48874082/ef6c904b-e431-4a85-9ef2-f08bcf8e21e4)


#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
#5084 
#5108 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-01-30 06:35:33 +00:00
Hitesh Pattanayak
08d4803ebe
handles multiple lookup field values (#5112)
handles multiple lookup field values.

**Original `Lookup` list with single value**:
![Lookup-List-Single](https://github.com/alcionai/corso/assets/48874082/6a6b68cf-8fb9-4dfb-985e-702c4d74d3f0)

**Restored `Lookup` list with single value**:
![Restored-Lookup-List](https://github.com/alcionai/corso/assets/48874082/f97ac974-6a3b-4dd2-82c5-9f3596f9adaa)

**Original `Lookup` list with multiple values**:
![Lookup-List-Multi](https://github.com/alcionai/corso/assets/48874082/5f8b1b92-297f-4a66-b0b6-b5007d430690)

**Restored `Lookup` list with multiple values**:
![Restored-Lookup-List-Multi](https://github.com/alcionai/corso/assets/48874082/6c6d79ca-775d-4f50-abee-8090a28f3871)


#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
#5108 
#5084 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-01-30 05:49:00 +00:00
Hitesh Pattanayak
734fd7239e
handles multiple persons list items (#5111)
handles multiple persons list items.

**Original `person` list with single value**:
![Person-List](https://github.com/alcionai/corso/assets/48874082/a4a87cde-f907-4fc7-94da-f9ddda0f5a18)
 
**Restored `person` list with single value**:
![Restored-Person-List](https://github.com/alcionai/corso/assets/48874082/6b5c2a8b-743c-4020-9393-356d28948bf0)

**Original `person` list with multi value**:
![Person-List-Multi](https://github.com/alcionai/corso/assets/48874082/18d2c536-67ac-4b28-87be-2352764f2c95)

**Restored `person` list with multi value**:
![Restored-Person-List-Multi](https://github.com/alcionai/corso/assets/48874082/f9694e0d-d2cc-48d9-94f2-16b61c5b7cdb)

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
#5108 
#5084 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-01-30 04:55:03 +00:00
ashmrtn
d2f1bbb5c7
Skip ms auth checks for gock client (#5163)
Disable sending requests to Microsoft servers to get API tokens when
using the gock client. This is accomplished by passing in a mock that
always returns no error for auth requests (sketch below).

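A minimal sketch of that mock, against a locally defined interface rather than the real SDK credential type (an assumption for illustration):

```go
package mock

import "context"

// tokenProvider mirrors the shape of an auth credential (assumption; the
// real SDK interface differs).
type tokenProvider interface {
	GetToken(ctx context.Context) (string, error)
}

// staticTokenProvider satisfies auth requests locally, so gock-backed tests
// never reach Microsoft's token endpoints.
type staticTokenProvider struct{}

var _ tokenProvider = staticTokenProvider{}

func (staticTokenProvider) GetToken(context.Context) (string, error) {
	return "fake-token", nil // canned token, never an error
}
```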
**This PR does not go through and make existing tests using gock unit
tests instead of integration tests. That will need to be done
separately**

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

* #5124

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-01-30 02:41:53 +00:00
Niraj Tolia
45b021d58e
Upgrade Docusaurus to 3.1.1 (#5157)
Upgrades deps in sync

---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🗺️ Documentation
2024-01-29 23:43:59 +00:00
Keepers
8e6a47b103
add chats service and category to paths (#5065)
introduces the Chats service and Chats category

---

#### Does this PR need a docs update or release note?

- [x]  No

#### Type of change

- [x] 🌻 Feature

#### Issue(s)

* #5061

#### Test Plan

- [x]  Unit test
- [x] 💚 E2E
2024-01-29 22:03:31 +00:00
Hitesh Pattanayak
8ac7e6caa2
provisions to identify fields with multiple value (#5109)
provisions to identify fields with multiple values.
Some fields can hold either a single value or multiple values, based on a property called `allowMultipleValues`.
For example, the `Lookup` column:
```json
"lookup": {
    "allowMultipleValues": true,
    "allowUnlimitedLength": false,
    "columnName": "Title",
    "listId": "21b45bf2-e495-4582-b114-839577ff8e4f"
}
```
But `choice` columns, even though they allow setting multiple choices/values, don't have the `allowMultipleValues` field to indicate it.
So in this PR we try to determine the same from the stored values while restoring (a sketch follows below).

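A small sketch of inferring multiplicity from the stored value's shape (a hypothetical helper over the decoded JSON fields map):

```go
package lists

// isMultiValue reports whether a stored field value holds multiple entries:
// an array-shaped value implies a multi-choice column, a scalar implies a
// single choice.
func isMultiValue(v any) bool {
	switch v.(type) {
	case []any, []string:
		return true
	default:
		return false
	}
}
```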
**Original list with `choice` column in site**:
![Choice-List-Multi](https://github.com/alcionai/corso/assets/48874082/d4457b3c-0230-4a69-8467-f64b7d7d4f04)

**Restored list with `choice` column in site**
![Restored-Choice-List-Multi](https://github.com/alcionai/corso/assets/48874082/78478055-0e84-43d0-ac83-262b564ce778)
The color does not come through, though.


#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)
#5108 

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [x]  Unit test
- [x] 💚 E2E
2024-01-29 21:27:35 +00:00
ashmrtn
6ef2c2d494
Use type switch instead of strings of types (#5156)
Instead of relying on strings pulled from the graph SDK to identify the types of the items we're dealing with, use the type of the interface in Go (see the sketch below).

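An illustrative type switch over locally defined stand-ins for the SDK item interfaces (the real code switches on the graph SDK's model types):

```go
package deserialize

import "fmt"

// Stand-ins for the SDK model interfaces (assumptions for illustration).
type Messageable interface{ isMessage() }
type Eventable interface{ isEvent() }

// kindOf identifies an item by its static Go type instead of comparing
// SDK-provided type-name strings.
func kindOf(item any) (string, error) {
	switch item.(type) {
	case Messageable:
		return "message", nil
	case Eventable:
		return "event", nil
	default:
		return "", fmt.Errorf("unsupported item type %T", item)
	}
}
```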
---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [x] 🧹 Tech Debt/Cleanup

#### Issue(s)

* #5124

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-29 19:53:22 +00:00
Abin Simon
8a7a61f05d
CI to catch incorrect use of clues (#5144)
<!-- PR description-->

Catches issues where we use the `WC` variant of clues when `ctx` is passed, as well as cases where we don't use the `WC` variant when `ctx` is not passed in.

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [x] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-01-29 18:26:42 +00:00
ashmrtn
e1cb5b6313
Test closing repo without connect works (#5115)
Just make sure we can call Close() on the repository struct even if we haven't made a connection to kopia or initialized all the internal fields (a minimal sketch of such a test is below).

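A minimal sketch of the kind of test, with a toy `Repository` standing in for corso's real repository struct (whose fields and `Close` signature may differ):

```go
package repository

import (
	"context"
	"io"
	"testing"
)

type Repository struct {
	conn io.Closer // nil until a kopia connection is made
}

func (r *Repository) Close(ctx context.Context) error {
	if r.conn == nil {
		return nil // nothing was ever connected; Close is a no-op
	}
	return r.conn.Close()
}

func TestCloseWithoutConnect(t *testing.T) {
	r := &Repository{} // zero value: never connected, internals uninitialized
	if err := r.Close(context.Background()); err != nil {
		t.Fatalf("Close on unconnected repo: %v", err)
	}
}
```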
---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-01-29 17:52:03 +00:00
ashmrtn
85aaa448c5
Call config verify command during maintenance (#5123)
Add the wiring to actually call the new config verify command during
maintenance

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

merge after:
* #5117
* #5118
* #5122

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-29 16:25:25 +00:00
dependabot[bot]
8437724254
⬆️ Bump dorny/paths-filter from 2 to 3 (#5151)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 2 to 3.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/dorny/paths-filter/releases">dorny/paths-filter's releases</a>.</em></p>
<blockquote>
<h2>v3.0.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Update README.md: added real world usage example by <a href="https://github.com/iamtodor"><code>@​iamtodor</code></a> in <a href="https://redirect.github.com/dorny/paths-filter/pull/178">dorny/paths-filter#178</a></li>
<li>Update Node.js to version 20 by <a href="https://github.com/danielhjacobs"><code>@​danielhjacobs</code></a> in <a href="https://redirect.github.com/dorny/paths-filter/pull/206">dorny/paths-filter#206</a></li>
<li>Update to nodejs 20 by <a href="https://github.com/dorny"><code>@​dorny</code></a> in <a href="https://redirect.github.com/dorny/paths-filter/pull/210">dorny/paths-filter#210</a></li>
<li>chore(deps): bump checkout action to v4 and use setup-node to setup node and cache npm deps by <a href="https://github.com/chenrui333"><code>@​chenrui333</code></a> in <a href="https://redirect.github.com/dorny/paths-filter/pull/211">dorny/paths-filter#211</a></li>
<li>Update all dependencies by <a href="https://github.com/dorny"><code>@​dorny</code></a> in <a href="https://redirect.github.com/dorny/paths-filter/pull/215">dorny/paths-filter#215</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/iamtodor"><code>@​iamtodor</code></a> made their first contribution in <a href="https://redirect.github.com/dorny/paths-filter/pull/178">dorny/paths-filter#178</a></li>
<li><a href="https://github.com/danielhjacobs"><code>@​danielhjacobs</code></a> made their first contribution in <a href="https://redirect.github.com/dorny/paths-filter/pull/206">dorny/paths-filter#206</a></li>
<li><a href="https://github.com/chenrui333"><code>@​chenrui333</code></a> made their first contribution in <a href="https://redirect.github.com/dorny/paths-filter/pull/211">dorny/paths-filter#211</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/dorny/paths-filter/compare/v2.11.1...v3.0.0">https://github.com/dorny/paths-filter/compare/v2.11.1...v3.0.0</a></p>
<h2>v2.11.1</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/167">Update <code>@​actions/core</code> to v1.10.0 - Fixes warning about deprecated set-output</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/168">Document need for pull-requests: read permission</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/164">Updating to actions/checkout@v3</a></li>
</ul>
<h2>v2.11.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/157">Set list-files input parameter as not required</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/161">Update Node.js</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/162">Fix incorrect handling of Unicode characters in exec()</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/163">Use Octokit pagination</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/160">Updates real world links</a></li>
</ul>
<h2>v2.10.2</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/91">Fix getLocalRef() returns wrong ref</a></li>
</ul>
<h2>v2.10.1</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/85">Improve robustness of change detection</a></li>
</ul>
<h2>v2.10.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/82">Add ref input parameter</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/83">Fix change detection in PR when <code>pullRequest.changed_files</code> is incorrect</a></li>
</ul>
<h2>v2.9.3</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/78">Fix change detection when base is a tag</a></li>
</ul>
<h2>v2.9.2</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/75">Fix fetching git history</a></li>
</ul>
<h2>v2.9.1</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/74">Fix fetching git history + fallback to unshallow repo</a></li>
</ul>
<h2>v2.9.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/68">Add list-files: csv format</a></li>
</ul>
<h2>v2.8.0</h2>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md">dorny/paths-filter's changelog</a>.</em></p>
<blockquote>
<h1>Changelog</h1>
<h2>v3.0.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/210">Update to Node.js 20 </a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/215">Update all dependencies</a></li>
</ul>
<h2>v2.11.1</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/167">Update <code>@​actions/core</code> to v1.10.0 - Fixes warning about deprecated set-output</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/168">Document need for pull-requests: read permission</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/164">Updating to actions/checkout@v3</a></li>
</ul>
<h2>v2.11.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/157">Set list-files input parameter as not required</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/161">Update Node.js</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/162">Fix incorrect handling of Unicode characters in exec()</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/163">Use Octokit pagination</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/160">Updates real world links</a></li>
</ul>
<h2>v2.10.2</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/91">Fix getLocalRef() returns wrong ref</a></li>
</ul>
<h2>v2.10.1</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/85">Improve robustness of change detection</a></li>
</ul>
<h2>v2.10.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/82">Add ref input parameter</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/83">Fix change detection in PR when pullRequest.changed_files is incorrect</a></li>
</ul>
<h2>v2.9.3</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/78">Fix change detection when base is a tag</a></li>
</ul>
<h2>v2.9.2</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/75">Fix fetching git history</a></li>
</ul>
<h2>v2.9.1</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/74">Fix fetching git history + fallback to unshallow repo</a></li>
</ul>
<h2>v2.9.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/68">Add list-files: csv format</a></li>
</ul>
<h2>v2.8.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/65">Add count output variable</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/61">Fix log grouping of changes</a></li>
</ul>
<h2>v2.7.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/59">Add &quot;changes&quot; output variable to support matrix job configuration</a></li>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/58">Improved listing of matching files with <code>list-files: shell</code> and <code>list-files: escape</code> options</a></li>
</ul>
<h2>v2.6.0</h2>
<ul>
<li><a href="https://redirect.github.com/dorny/paths-filter/pull/53">Support local changes</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="0bc4621a31"><code>0bc4621</code></a> Bump major version to v3</li>
<li><a href="7267a8516b"><code>7267a85</code></a> Update CHANGELOG for v2.12.0</li>
<li><a href="e36f1124bf"><code>e36f112</code></a> Merge pull request <a href="https://redirect.github.com/dorny/paths-filter/issues/215">#215</a> from dorny/update-dependencies</li>
<li><a href="2f74457227"><code>2f74457</code></a> Update all dependencies</li>
<li><a href="67617953b4"><code>6761795</code></a> Update examples in README  to use checkout@v4</li>
<li><a href="a35d8d6a33"><code>a35d8d6</code></a> Merge pull request <a href="https://redirect.github.com/dorny/paths-filter/issues/211">#211</a> from chenrui333/node-20</li>
<li><a href="b5a5203f8b"><code>b5a5203</code></a> chore(deps): bump checkout action to v4 and use setup-node to setup node and ...</li>
<li><a href="3c49e64ca2"><code>3c49e64</code></a> Merge pull request <a href="https://redirect.github.com/dorny/paths-filter/issues/210">#210</a> from dorny/use-nodejs-20</li>
<li><a href="8ec7be4734"><code>8ec7be4</code></a> Update to nodejs 20</li>
<li><a href="100a1198b2"><code>100a119</code></a> Revert back to node16</li>
<li>Additional commits viewable in <a href="https://github.com/dorny/paths-filter/compare/v2...v3">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=dorny/paths-filter&package-manager=github_actions&previous-version=2&new-version=3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-01-29 11:06:45 +00:00
dependabot[bot]
08e1b1d1e6
⬆️ Bump github.com/arran4/golang-ical from 0.2.3 to 0.2.4 in /src (#5145)
Bumps [github.com/arran4/golang-ical](https://github.com/arran4/golang-ical) from 0.2.3 to 0.2.4.
<details>
<summary>Commits</summary>
<ul>
<li><a href="51fa6f1213"><code>51fa6f1</code></a> Merge pull request <a href="https://redirect.github.com/arran4/golang-ical/issues/85">#85</a> from brackendawson/panic</li>
<li><a href="46e2a5c0ed"><code>46e2a5c</code></a> Exclude fuzz testing from pre-1.18 toolchains</li>
<li><a href="a8f0586c90"><code>a8f0586</code></a> fix panic when param value has incomplete escape sequence</li>
<li><a href="3631125a31"><code>3631125</code></a> fix panic when property ends without colon or value</li>
<li><a href="681cc6e62c"><code>681cc6e</code></a> fix panic for missing property param operator</li>
<li><a href="804f4d3436"><code>804f4d3</code></a> Merge pull request <a href="https://redirect.github.com/arran4/golang-ical/issues/84">#84</a> from arran4/pull-66-add-todo-journal</li>
<li><a href="cf8d1b371d"><code>cf8d1b3</code></a> Ignore explicitly</li>
<li><a href="1e96c15957"><code>1e96c15</code></a> Consistent receivers.</li>
<li><a href="9e30bdf5f0"><code>9e30bdf</code></a> Ignore explicitly</li>
<li><a href="55df13ec27"><code>55df13e</code></a> Consistent todo receiver</li>
<li>Additional commits viewable in <a href="https://github.com/arran4/golang-ical/compare/v0.2.3...v0.2.4">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/arran4/golang-ical&package-manager=go_modules&previous-version=0.2.3&new-version=0.2.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-01-29 10:28:48 +00:00
dependabot[bot]
41f2808bd9
⬆️ Bump jarallax from 2.1.4 to 2.2.0 in /website (#5147)
Bumps [jarallax](https://github.com/nk-o/jarallax) from 2.1.4 to 2.2.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/nk-o/jarallax/blob/master/CHANGELOG.md">jarallax's changelog</a>.</em></p>
<blockquote>
<h2>[2.2.0] - Jan 27, 2024</h2>
<ul>
<li>updated video worker:
<ul>
<li>added support for private Vimeo videos hash in URL</li>
<li>fixed <code>play</code> method play when <code>endTime</code> reached</li>
</ul>
</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="f8060f98d1"><code>f8060f9</code></a> v2.2.0</li>
<li><a href="2df482e478"><code>2df482e</code></a> updated video-worker</li>
<li><a href="afa456f38b"><code>afa456f</code></a> Update npm-publish.yml</li>
<li>See full diff in <a href="https://github.com/nk-o/jarallax/compare/v2.1.4...v2.2.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=jarallax&package-manager=npm_and_yarn&previous-version=2.1.4&new-version=2.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>
2024-01-29 05:25:28 +00:00
Abin Simon
79194c44df
Fix event cancellation status in export (#5107)
<!-- PR description-->

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-29 04:38:09 +00:00
ashmrtn
5f036a0cc1
Use fault bus and alerts instead of error for config verify (#5122)
Switch to using alerts and the fault bus instead of errors. Hopefully
this will make it easier to ensure the verify code doesn't fail
maintenance overall, and that callers can consume the info in a
standardized manner.
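
For readers unfamiliar with the pattern, here's a minimal, self-contained sketch of reporting through an alert bus instead of returning errors. The `Bus` and `Alert` types below are illustrative stand-ins, not corso's actual `fault` package API:

```go
package main

import "fmt"

// Alert is a non-fatal notice that a check found something worth surfacing.
// It is collected rather than returned, so the overall run keeps going.
type Alert struct {
	Message string
	Data    map[string]any
}

// Bus gathers alerts from many checks so callers can consume them uniformly.
type Bus struct {
	alerts []Alert
}

// AddAlert records a notice without failing the caller.
func (b *Bus) AddAlert(a Alert) { b.alerts = append(b.alerts, a) }

// Alerts returns everything recorded during the run.
func (b *Bus) Alerts() []Alert { return b.alerts }

// verifyConfig reports misconfigurations as alerts instead of errors,
// so a bad setting can't abort maintenance outright.
func verifyConfig(bus *Bus, retentionEnabled bool) {
	if retentionEnabled {
		bus.AddAlert(Alert{
			Message: "snapshot retention should be disabled",
			Data:    map[string]any{"retentionEnabled": retentionEnabled},
		})
	}
}

func main() {
	bus := &Bus{}
	verifyConfig(bus, true)

	// The caller decides how to surface the collected alerts.
	for _, a := range bus.Alerts() {
		fmt.Println("alert:", a.Message, a.Data)
	}
}
```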

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

merge after:
* #5117
* #5118

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-27 04:04:12 +00:00
ashmrtn
d426250931
Validate kopia config (#5118)
Add function that returns errors if it finds issues with common config
info in the kopia repo. Parameters that are currently checked are:

* kopia global policy:
    * kopia snapshot retention is disabled
    * kopia compression matches the default compression for corso
    * kopia scheduling is disabled
* object locking:
    * parameters in the maintenance and blob config blobs are consistent
    (i.e. all enabled or all disabled)

Note that tests for this will fail until alcionai/clues#40 is
merged and clues is updated in corso.
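
A rough sketch of what such a verification function can look like, collecting every finding instead of failing fast. The `repoConfig` fields and the default compression value are assumptions for illustration, not corso's actual types:

```go
package main

import (
	"errors"
	"fmt"
)

// repoConfig is an illustrative stand-in for the kopia settings checked
// by the verify function; the field names here are assumptions.
type repoConfig struct {
	RetentionEnabled   bool
	Compression        string
	SchedulingEnabled  bool
	MaintenanceLocking bool
	BlobLocking        bool
}

// defaultCompression is a placeholder for whatever default corso configures.
const defaultCompression = "zstd-better-compression"

// verifyRepoConfig collects every issue instead of stopping at the first,
// mirroring the "report all findings" behavior described above.
func verifyRepoConfig(c repoConfig) error {
	var errs []error

	if c.RetentionEnabled {
		errs = append(errs, errors.New("kopia snapshot retention should be disabled"))
	}

	if c.Compression != defaultCompression {
		errs = append(errs, fmt.Errorf(
			"compression %q does not match corso default %q",
			c.Compression, defaultCompression))
	}

	if c.SchedulingEnabled {
		errs = append(errs, errors.New("kopia scheduling should be disabled"))
	}

	// Object locking must be all-or-nothing across maintenance and blob config.
	if c.MaintenanceLocking != c.BlobLocking {
		errs = append(errs, errors.New(
			"maintenance and blob config locking parameters are inconsistent"))
	}

	return errors.Join(errs...)
}

func main() {
	err := verifyRepoConfig(repoConfig{RetentionEnabled: true, Compression: "none"})
	fmt.Println(err)
}
```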

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [x] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [x] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Issue(s)

merge after:
* #5117
* alcionai/clues#40

#### Test Plan

- [ ] 💪 Manual
- [x]  Unit test
- [ ] 💚 E2E
2024-01-27 01:53:11 +00:00
ashmrtn
c3f4dd6bcf
Pull in latest version of clues (#5132)
Latest clues patch fixes a bug where the result of `clues.Stack(clues.Stack(nil))`
was not considered nil.
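
A toy illustration of the fixed behavior, using a stand-in `stack` function rather than the real clues API:

```go
package main

import "fmt"

// stack mimics the relevant behavior: wrap only non-nil errors, so stacking
// nothing (or only nils) yields nil rather than a non-nil wrapper around
// nothing. The fix described above made the doubly-wrapped case behave this way.
func stack(errs ...error) error {
	var kept []error

	for _, e := range errs {
		if e != nil {
			kept = append(kept, e)
		}
	}

	if len(kept) == 0 {
		return nil // previously a nested stack of nil could leak out non-nil
	}

	return fmt.Errorf("stacked: %w", kept[0])
}

func main() {
	// Both must be nil for `if err != nil` checks to work as intended.
	fmt.Println(stack(nil) == nil)        // true
	fmt.Println(stack(stack(nil)) == nil) // true after the fix
}
```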

---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

- [ ] 🌻 Feature
- [x] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [ ] 🧹 Tech Debt/Cleanup

#### Test Plan

- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-01-27 00:49:11 +00:00
Keepers
f7a9ca836f
flush progress bars before all print calls (#5131)
Hopefully fixes clobbering issues for end-of-run tables.
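
A minimal sketch of the flush-before-print ordering, with a stand-in progress type rather than corso's actual progress-bar machinery:

```go
package main

import (
	"fmt"
	"sync"
)

// progress stands in for an async progress-bar pool; late redraws from the
// real one are what clobbered end-of-run tables.
type progress struct {
	mu      sync.Mutex
	pending []string
}

func (p *progress) update(s string) {
	p.mu.Lock()
	defer p.mu.Unlock()
	p.pending = append(p.pending, s)
}

// flush drains any in-flight bar output so nothing renders after it returns.
func (p *progress) flush() {
	p.mu.Lock()
	defer p.mu.Unlock()

	for _, s := range p.pending {
		fmt.Println(s)
	}

	p.pending = nil
}

// printTable flushes the bars first, guaranteeing the table is the last
// thing written and can't be interleaved with late bar redraws.
func printTable(p *progress, rows []string) {
	p.flush()

	for _, r := range rows {
		fmt.Println(r)
	}
}

func main() {
	p := &progress{}
	p.update("backup 50%")
	p.update("backup 100%")
	printTable(p, []string{"ID\tStatus", "abc123\tcomplete"})
}
```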

---

#### Does this PR need a docs update or release note?

- [x]  Yes, it's included

#### Type of change

- [x] 🐛 Bugfix

#### Issue(s)

* #5113

#### Test Plan

- [x] 💪 Manual
2024-01-27 00:10:10 +00:00
Abhishek Pandey
8133da3087
Centralize metadata suffix definitions (#5134)
<!-- PR description-->

Centralizing the suffixes so that group mailbox and future services can also use them.

* Move `.meta`, `.data`, `.dirmeta` definitions from `internal/m365/collection/drive/metadata` pkg to `pkg/services/m365/api/graph/metadata`.
* Adjust package names in misc places.
* No logic change. 
* We should probably keep all metadata definitions and code in one place, i.e. relocate drive metadata definitions to `pkg/services/m365/api/graph/metadata`. This is a bigger change; we can do it later. A rough sketch of such a shared package follows.
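
Here is the rough shape, as a sketch; the suffix constants mirror the ones named above, while the helper function is an illustrative assumption:

```go
// Package metadata sketches the kind of shared suffix definitions described
// above, usable by drives, group mailboxes, and future services alike.
package metadata

import "strings"

const (
	DataFileSuffix    = ".data"
	MetaFileSuffix    = ".meta"
	DirMetaFileSuffix = ".dirmeta"
)

// IsMetadataFile reports whether a backed-up item name refers to
// metadata rather than content, regardless of which service wrote it.
func IsMetadataFile(name string) bool {
	return strings.HasSuffix(name, MetaFileSuffix) ||
		strings.HasSuffix(name, DirMetaFileSuffix)
}
```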
---

#### Does this PR need a docs update or release note?

- [ ]  Yes, it's included
- [ ] 🕐 Yes, but in a later PR
- [x]  No

#### Type of change

<!--- Please check the type of change your PR introduces: --->
- [ ] 🌻 Feature
- [ ] 🐛 Bugfix
- [ ] 🗺️ Documentation
- [ ] 🤖 Supportability/Tests
- [ ] 💻 CI/Deployment
- [x] 🧹 Tech Debt/Cleanup

#### Issue(s)

<!-- Can reference multiple issues. Use one of the following "magic words" - "closes, fixes" to auto-close the Github issue. -->
* #<issue>

#### Test Plan

<!-- How will this be tested prior to merging.-->
- [x] 💪 Manual
- [ ]  Unit test
- [ ] 💚 E2E
2024-01-26 22:06:04 +00:00
285 changed files with 19849 additions and 3395 deletions

View File

@@ -1,4 +1,5 @@
 name: Backup Restore Test
+description: Run various backup/restore/export tests for a service.
 inputs:
   service:

View File

@@ -1,4 +1,5 @@
 name: Setup and Cache Golang
+description: Build golang binaries for later use in CI.
 # clone of: https://github.com/magnetikonline/action-golang-cache/blob/main/action.yaml
 #

View File

@@ -1,4 +1,5 @@
 name: Publish Binary
+description: Publish binary artifacts.
 inputs:
   version:

View File

@@ -1,4 +1,5 @@
 name: Publish Website
+description: Publish website artifacts.
 inputs:
   aws-iam-role:

View File

@@ -1,4 +1,5 @@
 name: Purge M365 User Data
+description: Deletes M365 data generated during CI tests.
 # Hard deletion of an m365 user's data. Our CI processes create a lot
 # of data churn (creation and immediate deletion) of files, the likes
@@ -30,12 +31,19 @@ inputs:
     description: Secret value of for AZURE_CLIENT_ID
   azure-client-secret:
     description: Secret value of for AZURE_CLIENT_SECRET
+  azure-pnp-client-id:
+    description: Secret value of AZURE_PNP_CLIENT_ID
+  azure-pnp-client-cert:
+    description: Base64 encoded private certificate for the azure-pnp-client-id (Secret value of AZURE_PNP_CLIENT_CERT)
   azure-tenant-id:
-    description: Secret value of for AZURE_TENANT_ID
+    description: Secret value of AZURE_TENANT_ID
   m365-admin-user:
     description: Secret value of for M365_TENANT_ADMIN_USER
   m365-admin-password:
     description: Secret value of for M365_TENANT_ADMIN_PASSWORD
+  tenant-domain:
+    description: The domain of the tenant (ex. 10rqc2.onmicrosft.com)
+    required: true
 runs:
   using: composite
@@ -53,7 +61,13 @@ runs:
         AZURE_CLIENT_ID: ${{ inputs.azure-client-id }}
         AZURE_CLIENT_SECRET: ${{ inputs.azure-client-secret }}
         AZURE_TENANT_ID: ${{ inputs.azure-tenant-id }}
-      run: ./exchangePurge.ps1 -User ${{ inputs.user }} -FolderNamePurgeList PersonMetadata -FolderPrefixPurgeList "${{ inputs.folder-prefix }}".Split(",") -PurgeBeforeTimestamp ${{ inputs.older-than }}
+      run: |
+        for ($ATTEMPT_NUM = 1; $ATTEMPT_NUM -le 3; $ATTEMPT_NUM++)
+        {
+          if (./exchangePurge.ps1 -User ${{ inputs.user }} -FolderNamePurgeList PersonMetadata -FolderPrefixPurgeList "${{ inputs.folder-prefix }}".Split(",") -PurgeBeforeTimestamp ${{ inputs.older-than }}) {
+            break
+          }
+        }
     # TODO(ashmrtn): Re-enable when we figure out errors we're seeing with Get-Mailbox call.
     #- name: Reset retention for all mailboxes to 0
@@ -74,10 +88,16 @@ runs:
       shell: pwsh
       working-directory: ./src/cmd/purge/scripts
      env:
-        M365_TENANT_ADMIN_USER: ${{ inputs.m365-admin-user }}
-        M365_TENANT_ADMIN_PASSWORD: ${{ inputs.m365-admin-password }}
+        AZURE_CLIENT_ID: ${{ inputs.azure-pnp-client-id }}
+        AZURE_APP_CERT: ${{ inputs.azure-pnp-client-cert }}
+        TENANT_DOMAIN: ${{ inputs.tenant-domain }}
       run: |
-        ./onedrivePurge.ps1 -User ${{ inputs.user }} -FolderPrefixPurgeList "${{ inputs.folder-prefix }}".Split(",") -PurgeBeforeTimestamp ${{ inputs.older-than }}
+        for ($ATTEMPT_NUM = 1; $ATTEMPT_NUM -le 3; $ATTEMPT_NUM++)
+        {
+          if (./onedrivePurge.ps1 -User ${{ inputs.user }} -FolderPrefixPurgeList "${{ inputs.folder-prefix }}".Split(",") -PurgeBeforeTimestamp ${{ inputs.older-than }}) {
+            break
+          }
+        }
 ################################################################################################################
 # Sharepoint
@@ -88,6 +108,14 @@ runs:
       shell: pwsh
       working-directory: ./src/cmd/purge/scripts
       env:
-        M365_TENANT_ADMIN_USER: ${{ inputs.m365-admin-user }}
-        M365_TENANT_ADMIN_PASSWORD: ${{ inputs.m365-admin-password }}
-      run: ./onedrivePurge.ps1 -Site ${{ inputs.site }} -LibraryNameList "${{ inputs.libraries }}".split(",") -FolderPrefixPurgeList ${{ inputs.folder-prefix }} -LibraryPrefixDeleteList ${{ inputs.library-prefix && inputs.library-prefix || '[]' }} -PurgeBeforeTimestamp ${{ inputs.older-than }}
+        AZURE_CLIENT_ID: ${{ inputs.azure-pnp-client-id }}
+        AZURE_APP_CERT: ${{ inputs.azure-pnp-client-cert }}
+        TENANT_DOMAIN: ${{ inputs.tenant-domain }}
+      run: |
+        for ($ATTEMPT_NUM = 1; $ATTEMPT_NUM -le 3; $ATTEMPT_NUM++)
+        {
+          if (./onedrivePurge.ps1 -Site ${{ inputs.site }} -LibraryNameList "${{ inputs.libraries }}".split(",") -FolderPrefixPurgeList ${{ inputs.folder-prefix }} -LibraryPrefixDeleteList ${{ inputs.library-prefix && inputs.library-prefix || '[]' }} -PurgeBeforeTimestamp ${{ inputs.older-than }}) {
+            break
+          }
+        }

View File

@@ -1,4 +1,5 @@
 name: Send a message to Teams
+description: Send messages to communication apps.
 inputs:
   msg:

View File

@@ -1,4 +1,5 @@
 name: Lint Website
+description: Lint website content.
 inputs:
   version:

View File

@@ -28,7 +28,7 @@ jobs:
       # only run CI tests if the src folder or workflow actions have changed
       - name: Check for file changes in src/ or .github/workflows/
-        uses: dorny/paths-filter@v2
+        uses: dorny/paths-filter@v3
         id: dornycheck
         with:
           list-files: json

View File

@@ -40,5 +40,5 @@ jobs:
         if: failure()
         uses: ./.github/actions/teams-message
         with:
-          msg: "[FAILED] Publishing Binary"
+          msg: "[CORSO FAILED] Publishing Binary"
           teams_url: ${{ secrets.TEAMS_CORSO_CI_WEBHOOK_URL }}

View File

@@ -463,7 +463,7 @@ jobs:
           go-version-file: src/go.mod
       - name: Go Lint
-        uses: golangci/golangci-lint-action@v3
+        uses: golangci/golangci-lint-action@v4
         with:
           # Keep pinned to a verson as sometimes updates will add new lint
           # failures in unchanged code.
@@ -518,6 +518,20 @@ jobs:
             echo "Make sure to propagate errors with clues"
             exit 1
           fi
+      - name: Check if clues without context are used when context is passed in
+        run: |
+          # Using `grep .` as the exit codes are always true for correct grammar
+          if tree-grepper -q go '((function_declaration (parameter_list . (parameter_declaration (identifier) @_octx)) body: (block (short_var_declaration left: (expression_list (identifier) @_err . ) right: (expression_list (call_expression (argument_list . (identifier) @_ctx)))) . (if_statement (binary_expression) @_exp consequence: (block (return_statement (expression_list (call_expression (selector_expression (call_expression (selector_expression) @clue))) . )))))) (#eq? @_err "err") (#eq? @_octx "ctx") (#eq? @_ctx "ctx") (#eq? @_exp "err != nil") (#match? @clue "^clues\.") (#match? @clue "WC$"))' | grep .; then
+            echo "Do not use clues.*WC when context is passed in"
+            exit 1
+          fi
+      - name: Check clues with context is used when context is not passed in
+        run: |
+          # Using `grep .` as the exit codes are always true for correct grammar
+          if tree-grepper -q go '((function_declaration (parameter_list . (parameter_declaration (identifier) @_octx)) body: (block (short_var_declaration left: (expression_list (identifier) @_err . ) right: (expression_list (call_expression (argument_list . (identifier) @_ctx)))) . (if_statement (binary_expression) @_exp consequence: (block (return_statement (expression_list (call_expression (selector_expression (call_expression (selector_expression) @clue))) . )))))) (#eq? @_err "err") (#eq? @_octx "ctx") (#not-eq? @_ctx "ctx") (#eq? @_exp "err != nil") (#match? @clue "^clues\.") (#not-match? @clue "WC$"))' | grep .; then
+            echo "Use clues.*WC when context is not passed in"
+            exit 1
+          fi
 # ----------------------------------------------------------------------------------------------------
 # --- GitHub Actions Linting -------------------------------------------------------------------------

View File

@@ -12,7 +12,7 @@ jobs:
     continue-on-error: true
     strategy:
       matrix:
-        user: [ CORSO_M365_TEST_USER_ID, CORSO_SECONDARY_M365_TEST_USER_ID, '' ]
+        user: [CORSO_M365_TEST_USER_ID, CORSO_SECONDARY_M365_TEST_USER_ID, ""]
     steps:
       - uses: actions/checkout@v4
@@ -33,12 +33,15 @@ jobs:
          azure-tenant-id: ${{ secrets.TENANT_ID }}
          m365-admin-user: ${{ secrets.M365_TENANT_ADMIN_USER }}
          m365-admin-password: ${{ secrets.M365_TENANT_ADMIN_PASSWORD }}
+          azure-pnp-client-id: ${{ secrets.AZURE_PNP_CLIENT_ID }}
+          azure-pnp-client-cert: ${{ secrets.AZURE_PNP_CLIENT_CERT }}
+          tenant-domain: ${{ vars.TENANT_DOMAIN }}
       - name: Notify failure in teams
         if: failure()
         uses: ./.github/actions/teams-message
         with:
-          msg: "[FAILED] ${{ vars[matrix.user] }} CI Cleanup"
+          msg: "[CORSO FAILED] ${{ vars[matrix.user] }} CI Cleanup"
           teams_url: ${{ secrets.TEAMS_CORSO_CI_WEBHOOK_URL }}
   Test-Site-Data-Cleanup:
@@ -47,7 +50,7 @@ jobs:
     continue-on-error: true
     strategy:
       matrix:
-        site: [ CORSO_M365_TEST_SITE_URL, CORSO_M365_TEST_GROUPS_SITE_URL ]
+        site: [CORSO_M365_TEST_SITE_URL, CORSO_M365_TEST_GROUPS_SITE_URL]
     steps:
       - uses: actions/checkout@v4
@@ -70,10 +73,13 @@ jobs:
          azure-tenant-id: ${{ secrets.TENANT_ID }}
          m365-admin-user: ${{ secrets.M365_TENANT_ADMIN_USER }}
          m365-admin-password: ${{ secrets.M365_TENANT_ADMIN_PASSWORD }}
+          azure-pnp-client-id: ${{ secrets.AZURE_PNP_CLIENT_ID }}
+          azure-pnp-client-cert: ${{ secrets.AZURE_PNP_CLIENT_CERT }}
+          tenant-domain: ${{ vars.TENANT_DOMAIN }}
       - name: Notify failure in teams
         if: failure()
         uses: ./.github/actions/teams-message
         with:
-          msg: "[FAILED] ${{ vars[matrix.site] }} CI Cleanup"
+          msg: "[CORSO FAILED] ${{ vars[matrix.site] }} CI Cleanup"
           teams_url: ${{ secrets.TEAMS_CORSO_CI_WEBHOOK_URL }}

View File

@@ -155,3 +155,6 @@ jobs:
          azure-tenant-id: ${{ secrets.TENANT_ID }}
          m365-admin-user: ${{ secrets.M365_TENANT_ADMIN_USER }}
          m365-admin-password: ${{ secrets.M365_TENANT_ADMIN_PASSWORD }}
+          azure-pnp-client-id: ${{ secrets.AZURE_PNP_CLIENT_ID }}
+          azure-pnp-client-cert: ${{ secrets.AZURE_PNP_CLIENT_CERT }}
+          tenant-domain: ${{ vars.TENANT_DOMAIN }}

View File

@@ -6,7 +6,7 @@ on:
   workflow_dispatch:
     inputs:
       user:
-        description: 'User to run longevity test on'
+        description: "User to run longevity test on"
 permissions:
   # required to retrieve AWS credentials
@@ -23,7 +23,7 @@ jobs:
     uses: alcionai/corso/.github/workflows/accSelector.yaml@main
   Longevity-Tests:
-    needs: [ SetM365App ]
+    needs: [SetM365App]
     environment: Testing
     runs-on: ubuntu-latest
     env:
@@ -37,7 +37,7 @@ jobs:
       CORSO_LOG_FILE: ${{ github.workspace }}/src/testlog/run-longevity.log
       RESTORE_DEST_PFX: Corso_Test_Longevity_
       TEST_USER: ${{ github.event.inputs.user != '' && github.event.inputs.user || vars.CORSO_M365_TEST_USER_ID }}
-      PREFIX: 'longevity'
+      PREFIX: "longevity"
       # Options for retention.
       RETENTION_MODE: GOVERNANCE
@@ -113,7 +113,6 @@ jobs:
             --extend-retention \
             --prefix ${{ env.PREFIX }} \
             --bucket ${{ secrets.CI_RETENTION_TESTS_S3_BUCKET }} \
-            --succeed-if-exists \
             2>&1 | tee ${{ env.CORSO_LOG_DIR }}/gotest-repo-init.log
           if grep -q 'Failed to' ${{ env.CORSO_LOG_DIR }}/gotest-repo-init.log
@@ -393,5 +392,5 @@ jobs:
         if: failure()
         uses: ./.github/actions/teams-message
         with:
-          msg: "[FAILED] Longevity Test"
+          msg: "[CORSO FAILED] Longevity Test"
           teams_url: ${{ secrets.TEAMS_CORSO_CI_WEBHOOK_URL }}

View File

@@ -48,7 +48,7 @@ jobs:
 # ----------------------------------------------------------------------------------------------------
   Test-Suite-Trusted:
-    needs: [ Checkout, SetM365App]
+    needs: [Checkout, SetM365App]
     environment: Testing
     runs-on: ubuntu-latest
     defaults:
@@ -100,9 +100,9 @@ jobs:
           -timeout 2h \
           ./... 2>&1 | tee ./testlog/gotest-nightly.log | gotestfmt -hide successful-tests
 ##########################################################################################################################################
 # Logging & Notifications
       # Upload the original go test output as an artifact for later review.
       - name: Upload test log
@@ -118,5 +118,5 @@ jobs:
         if: failure()
         uses: ./.github/actions/teams-message
         with:
-          msg: "[FAILED] Nightly Checks"
+          msg: "[COROS FAILED] Nightly Checks"
           teams_url: ${{ secrets.TEAMS_CORSO_CI_WEBHOOK_URL }}

View File

@@ -19,7 +19,7 @@ jobs:
           private_key: ${{ secrets.PRIVATE_KEY }}
       - name: Slash Command Dispatch
-        uses: peter-evans/slash-command-dispatch@v3
+        uses: peter-evans/slash-command-dispatch@v4
         env:
           TOKEN: ${{ steps.generate_token.outputs.token }}
         with:

View File

@@ -6,7 +6,7 @@ on:
   workflow_dispatch:
     inputs:
       user:
-        description: 'User to run sanity test on'
+        description: "User to run sanity test on"
 permissions:
   # required to retrieve AWS credentials
@@ -23,7 +23,7 @@ jobs:
     uses: alcionai/corso/.github/workflows/accSelector.yaml@main
   Sanity-Tests:
-    needs: [ SetM365App ]
+    needs: [SetM365App]
     environment: Testing
     runs-on: ubuntu-latest
     env:
@@ -44,11 +44,10 @@ jobs:
       run:
        working-directory: src
 ##########################################################################################################################################
 # setup
     steps:
       - uses: actions/checkout@v4
       - name: Setup Golang with cache
@@ -64,9 +63,9 @@ jobs:
       - run: mkdir ${CORSO_LOG_DIR}
 ##########################################################################################################################################
 # Pre-Run cleanup
       # unlike CI tests, sanity tests are not expected to run concurrently.
       # however, the sanity yaml concurrency is set to a maximum of 1 run, preferring
@@ -91,6 +90,9 @@ jobs:
          azure-tenant-id: ${{ secrets.TENANT_ID }}
          m365-admin-user: ${{ secrets.M365_TENANT_ADMIN_USER }}
          m365-admin-password: ${{ secrets.M365_TENANT_ADMIN_PASSWORD }}
+          azure-pnp-client-id: ${{ secrets.AZURE_PNP_CLIENT_ID }}
+          azure-pnp-client-cert: ${{ secrets.AZURE_PNP_CLIENT_CERT }}
+          tenant-domain: ${{ vars.TENANT_DOMAIN }}
       - name: Purge CI-Produced Folders for Sites
         timeout-minutes: 30
@@ -106,10 +108,13 @@ jobs:
          azure-tenant-id: ${{ secrets.TENANT_ID }}
          m365-admin-user: ${{ secrets.M365_TENANT_ADMIN_USER }}
          m365-admin-password: ${{ secrets.M365_TENANT_ADMIN_PASSWORD }}
+          azure-pnp-client-id: ${{ secrets.AZURE_PNP_CLIENT_ID }}
+          azure-pnp-client-cert: ${{ secrets.AZURE_PNP_CLIENT_CERT }}
+          tenant-domain: ${{ vars.TENANT_DOMAIN }}
 ##########################################################################################################################################
 # Repository commands
       - name: Version Test
         timeout-minutes: 10
@@ -169,9 +174,9 @@ jobs:
             --mode complete \
             2>&1 | tee ${{ env.CORSO_LOG_DIR }}/gotest-repo-maintenance.log
 ##########################################################################################################################################
 # Exchange
       # generate new entries to roll into the next load test
       # only runs if the test was successful
@@ -193,8 +198,8 @@ jobs:
          service: exchange
          kind: first-backup
          backup-args: '--mailbox "${{ env.TEST_USER }}" --data "email"'
-          restore-args: '--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
+          restore-args: "--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
@@ -206,8 +211,8 @@ jobs:
          service: exchange
          kind: incremental
          backup-args: '--mailbox "${{ env.TEST_USER }}" --data "email"'
-          restore-args: '--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
+          restore-args: "--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
          backup-id: ${{ steps.exchange-backup.outputs.backup-id }}
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
@@ -220,8 +225,8 @@ jobs:
          service: exchange
          kind: non-delta
          backup-args: '--mailbox "${{ env.TEST_USER }}" --data "email" --disable-delta'
-          restore-args: '--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
+          restore-args: "--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
          backup-id: ${{ steps.exchange-backup.outputs.backup-id }}
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
@@ -234,16 +239,15 @@ jobs:
          service: exchange
          kind: non-delta-incremental
          backup-args: '--mailbox "${{ env.TEST_USER }}" --data "email"'
-          restore-args: '--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}'
+          restore-args: "--email-folder ${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.repo-init.outputs.result }}"
          backup-id: ${{ steps.exchange-backup.outputs.backup-id }}
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
 ##########################################################################################################################################
-##########################################################################################################################################
 # Onedrive
       # generate new entries for test
       - name: OneDrive - Create new data
@@ -270,8 +274,8 @@ jobs:
          service: onedrive
          kind: first-backup
          backup-args: '--user "${{ env.TEST_USER }}"'
-          restore-args: '--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}'
+          restore-args: "--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
@@ -295,14 +299,14 @@ jobs:
          service: onedrive
          kind: incremental
          backup-args: '--user "${{ env.TEST_USER }}"'
-          restore-args: '--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}'
+          restore-args: "--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-onedrive.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
 ##########################################################################################################################################
 # Sharepoint Library
       # generate new entries for test
       - name: SharePoint - Create new data
@@ -330,8 +334,8 @@ jobs:
          service: sharepoint
          kind: first-backup
          backup-args: '--site "${{ vars.CORSO_M365_TEST_SITE_URL }}" --data libraries'
-          restore-args: '--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}'
+          restore-args: "--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
          category: libraries
@@ -357,15 +361,15 @@ jobs:
          service: sharepoint
          kind: incremental
          backup-args: '--site "${{ vars.CORSO_M365_TEST_SITE_URL }}" --data libraries'
-          restore-args: '--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}'
+          restore-args: "--folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}"
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-sharepoint.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
          category: libraries
 ##########################################################################################################################################
 # Sharepoint Lists
       # generate new entries for test
       # The `awk | tr | sed` command chain is used to get a comma separated list of SharePoint list names.
@@ -403,7 +407,7 @@ jobs:
          service: sharepoint
          kind: first-backup-lists
          backup-args: '--site "${{ vars.CORSO_M365_TEST_SITE_URL }}" --data lists'
-          restore-args: "--list ${{ steps.new-data-creation-sharepoint-lists.outputs.result }} --destination Corso_Test_Sanity_Restore_$(date +'%Y%m%d_%H%M%S') --allow-lists-restore"
+          restore-args: "--list ${{ steps.new-data-creation-sharepoint-lists.outputs.result }} --destination Corso_Test_Sanity_Restore_$(date +'%Y%m%d_%H%M%S')"
          export-args: "--list ${{ steps.new-data-creation-sharepoint-lists.outputs.result }}"
          restore-container: "${{ steps.sharepoint-lists-store-restore-container.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
@@ -446,7 +450,7 @@ jobs:
          service: sharepoint
          kind: incremental-lists
          backup-args: '--site "${{ vars.CORSO_M365_TEST_SITE_URL }}" --data lists'
-          restore-args: "--list ${{ steps.inc-data-creation-sharepoint-lists.outputs.result }},${{ steps.new-data-creation-sharepoint-lists.outputs.result }} --destination Corso_Test_Sanity_Restore_$(date +'%Y%m%d_%H%M%S') --allow-lists-restore"
+          restore-args: "--list ${{ steps.inc-data-creation-sharepoint-lists.outputs.result }},${{ steps.new-data-creation-sharepoint-lists.outputs.result }} --destination Corso_Test_Sanity_Restore_$(date +'%Y%m%d_%H%M%S')"
          export-args: "--list ${{ steps.inc-data-creation-sharepoint-lists.outputs.result }},${{ steps.new-data-creation-sharepoint-lists.outputs.result }}"
          restore-container: "${{ steps.sharepoint-lists-store-restore-container-inc.outputs.result }},${{ steps.sharepoint-lists-store-restore-container.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
@@ -454,9 +458,9 @@ jobs:
          category: lists
          on-collision: copy
 ##########################################################################################################################################
 # Groups and Teams
       # generate new entries for test
       - name: Groups - Create new data
@@ -483,8 +487,8 @@ jobs:
        with:
          service: groups
          kind: first-backup
-          backup-args: '--group "${{ vars.CORSO_M365_TEST_TEAM_ID }}"'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-groups.outputs.result }}'
+          backup-args: '--group "${{ vars.CORSO_M365_TEST_TEAM_ID }}" --data messages,libraries'
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-groups.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
@@ -508,15 +512,15 @@ jobs:
        with:
          service: groups
          kind: incremental
-          backup-args: '--group "${{ vars.CORSO_M365_TEST_TEAM_ID }}"'
+          backup-args: '--group "${{ vars.CORSO_M365_TEST_TEAM_ID }}" --data messages,libraries'
          restore-args: '--site "${{ vars.CORSO_M365_TEST_GROUPS_SITE_URL }}" --folder ${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-groups.outputs.result }}'
-          restore-container: '${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-groups.outputs.result }}'
+          restore-container: "${{ env.RESTORE_DEST_PFX }}${{ steps.new-data-creation-groups.outputs.result }}"
          log-dir: ${{ env.CORSO_LOG_DIR }}
          with-export: true
 ##########################################################################################################################################
 # Logging & Notifications
       # Upload the original go test output as an artifact for later review.
       - name: Upload test log
@@ -532,5 +536,5 @@ jobs:
         if: failure()
         uses: ./.github/actions/teams-message
         with:
-          msg: "[FAILED] Sanity Tests"
+          msg: "[CORSO FAILED] Sanity Tests"
           teams_url: ${{ secrets.TEAMS_CORSO_CI_WEBHOOK_URL }}

View File

@@ -6,12 +6,22 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

 ## [Unreleased] (beta)

+### Fixed
+- Handle the case where an email or event cannot be retrieved from Exchange due to an `ErrorCorruptData` error. Corso will skip over the item but report it in the backup summary.
+- Emails attached within other emails are now correctly exported
+- Gracefully handle email and post attachments without name when exporting to eml
+- Use correct timezone for event start and end times in Exchange exports (helps fix issues in relative recurrence patterns)
+- Fixed an issue causing exports dealing with calendar data to have high memory usage
+
+## [v0.19.0] (beta) - 2024-02-06
+
 ### Added
 - Events can now be exported from Exchange backups as .ics files.
 - Update repo init configuration to reduce the total number of GET requests sent
   to the object store when using corso. This affects repos that have many
   backups created in them per day the most.
+- Feature Preview: Corso now supports backup, export & restore of SharePoint lists. Lists backup can be initiated using `corso backup create sharepoint --site <site-url> --data lists`.
+- Group mailbox(aka conversations) backup and export support is now officially available. Group mailbox posts can be exported as `.eml` files.

 ### Fixed
 - Retry transient 400 "invalidRequest" errors during onedrive & sharepoint backup.
@@ -19,10 +29,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Groups and Teams backups no longer fail when a resource has no display name.
 - Contacts in-place restore failed if the restore destination was empty.
 - Link shares with external users are now backed up and restored as expected
+- Ensure persistent repo config is populated on repo init if repo init failed partway through during the previous init attempt.

 ### Changed
 - When running `backup details` on an empty backup returns a more helpful error message.
 - Backup List additionally shows the data category for each backup.
+- Remove hidden `--succeed-if-exists` flag for repo init. Repo init will now succeed without error if run on an existing repo with the same passphrase.

 ### Known issues
 - Backing up a group mailbox item may fail if it has a very large number of attachments (500+).
@@ -30,6 +42,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Exchange in-place restore may restore items in well-known folders to different folders if the user has well-known folder names change based on locale and has updated the locale since the backup was created.
 - In-place Exchange contacts restore will merge items in folders named "Contacts" or "contacts" into the default folder.
 - External users with access through shared links will not receive these links as they are not sent via email during restore.
+- Graph API has limited support for certain column types such as `location`, `hyperlink/picture`, and `metadata`. Restoring SharePoint list items containing these columns will result in differences compared to the original items.
+- SharePoint list item attachments are not available due to graph API limitations.
+- Group mailbox restore is not supported due to limited Graph API support for creating mailbox items.
+- Due to Graph API limitations, any group mailbox items present in subfolders other than Inbox aren't backed up.

 ## [v0.18.0] (beta) - 2024-01-02
@@ -486,7 +502,8 @@ this case, Corso will skip over the item but report this in the backup summary.
 - Miscellaneous
   - Optional usage statistics reporting ([RM-35](https://github.com/alcionai/corso-roadmap/issues/35))

-[Unreleased]: https://github.com/alcionai/corso/compare/v0.18.0...HEAD
+[Unreleased]: https://github.com/alcionai/corso/compare/v0.19.0...HEAD
+[v0.19.0]: https://github.com/alcionai/corso/compare/v0.18.0...v0.19.0
 [v0.18.0]: https://github.com/alcionai/corso/compare/v0.17.0...v0.18.0
 [v0.17.0]: https://github.com/alcionai/corso/compare/v0.16.0...v0.17.0
 [v0.16.0]: https://github.com/alcionai/corso/compare/v0.15.0...v0.16.0

View File

@@ -1,3 +1,6 @@
+> [!NOTE]
+> **The Corso project is no longer actively maintained and has been archived**.
+
 <p align="center">
   <img src="https://github.com/alcionai/corso/blob/main/website/static/img/corso_logo.svg?raw=true" alt="Corso Logo" width="100" />
 </p>

View File

@@ -45,6 +45,7 @@ var serviceCommands = []func(cmd *cobra.Command) *cobra.Command{
 	addOneDriveCommands,
 	addSharePointCommands,
 	addGroupsCommands,
+	addTeamsChatsCommands,
 }

 // AddCommands attaches all `corso backup * *` commands to the parent.

View File

@@ -18,6 +18,7 @@ import (
 	"github.com/alcionai/corso/src/internal/common/idname"
 	"github.com/alcionai/corso/src/internal/operations"
 	"github.com/alcionai/corso/src/internal/tester"
+	"github.com/alcionai/corso/src/internal/tester/its"
 	"github.com/alcionai/corso/src/internal/tester/tconfig"
 	"github.com/alcionai/corso/src/pkg/config"
 	"github.com/alcionai/corso/src/pkg/path"
@@ -39,7 +40,7 @@ var (
 type NoBackupExchangeE2ESuite struct {
 	tester.Suite
 	dpnd dependencies
-	its  intgTesterSetup
+	m365 its.M365IntgTestSetup
 }

 func TestNoBackupExchangeE2ESuite(t *testing.T) {
@@ -54,7 +55,7 @@ func (suite *NoBackupExchangeE2ESuite) SetupSuite() {
 	ctx, flush := tester.NewContext(t)
 	defer flush()

-	suite.its = newIntegrationTesterSetup(t)
+	suite.m365 = its.GetM365(t)
 	suite.dpnd = prepM365Test(t, ctx, path.ExchangeService)
 }
@@ -93,7 +94,7 @@ func (suite *NoBackupExchangeE2ESuite) TestExchangeBackupListCmd_noBackups() {
 type BackupExchangeE2ESuite struct {
 	tester.Suite
 	dpnd dependencies
-	its  intgTesterSetup
+	m365 its.M365IntgTestSetup
 }

 func TestBackupExchangeE2ESuite(t *testing.T) {
@@ -108,7 +109,7 @@ func (suite *BackupExchangeE2ESuite) SetupSuite() {
 	ctx, flush := tester.NewContext(t)
 	defer flush()

-	suite.its = newIntegrationTesterSetup(t)
+	suite.m365 = its.GetM365(t)
 	suite.dpnd = prepM365Test(t, ctx, path.ExchangeService)
 }
@@ -138,7 +139,7 @@ func runExchangeBackupCategoryTest(suite *BackupExchangeE2ESuite, category path.
 	cmd, ctx := buildExchangeBackupCmd(
 		ctx,
 		suite.dpnd.configFilePath,
-		suite.its.user.ID,
+		suite.m365.User.ID,
 		category.String(),
 		&recorder)
@@ -149,8 +150,11 @@ func runExchangeBackupCategoryTest(suite *BackupExchangeE2ESuite, category path.
 	result := recorder.String()
 	t.Log("backup results", result)

-	// as an offhand check: the result should contain the m365 user id
-	assert.Contains(t, result, suite.its.user.ID)
+	// As an offhand check: the result should contain the m365 user's email.
+	assert.Contains(
+		t,
+		strings.ToLower(result),
+		strings.ToLower(suite.m365.User.Provider.Name()))
 }

 func (suite *BackupExchangeE2ESuite) TestExchangeBackupCmd_ServiceNotEnabled_email() {
@@ -173,7 +177,7 @@ func runExchangeBackupServiceNotEnabledTest(suite *BackupExchangeE2ESuite, categ
 	cmd, ctx := buildExchangeBackupCmd(
 		ctx,
 		suite.dpnd.configFilePath,
-		fmt.Sprintf("%s,%s", tconfig.UnlicensedM365UserID(suite.T()), suite.its.user.ID),
+		fmt.Sprintf("%s,%s", tconfig.UnlicensedM365UserID(suite.T()), suite.m365.User.ID),
 		category.String(),
 		&recorder)

 	err := cmd.ExecuteContext(ctx)
@@ -182,8 +186,11 @@ func runExchangeBackupServiceNotEnabledTest(suite *BackupExchangeE2ESuite, categ
 	result := recorder.String()
 	t.Log("backup results", result)

-	// as an offhand check: the result should contain the m365 user id
-	assert.Contains(t, result, suite.its.user.ID)
+	// As an offhand check: the result should contain the m365 user's email.
+	assert.Contains(
+		t,
+		strings.ToLower(result),
+		strings.ToLower(suite.m365.User.Provider.Name()))
 }

 func (suite *BackupExchangeE2ESuite) TestExchangeBackupCmd_userNotFound_email() {
@@ -242,7 +249,7 @@ func (suite *BackupExchangeE2ESuite) TestBackupCreateExchange_badAzureClientIDFl
 	cmd := cliTD.StubRootCmd(
 		"backup", "create", "exchange",
-		"--user", suite.its.user.ID,
+		"--user", suite.m365.User.ID,
 		"--azure-client-id", "invalid-value")
 	cli.BuildCommandTree(cmd)
@@ -266,7 +273,7 @@ func (suite *BackupExchangeE2ESuite) TestBackupCreateExchange_fromConfigFile() {
 	cmd := cliTD.StubRootCmd(
 		"backup", "create", "exchange",
-		"--user", suite.its.user.ID,
+		"--user", suite.m365.User.ID,
 		"--"+flags.ConfigFileFN, suite.dpnd.configFilePath)
 	cli.BuildCommandTree(cmd)
@@ -281,8 +288,11 @@ func (suite *BackupExchangeE2ESuite) TestBackupCreateExchange_fromConfigFile() {
 	result := suite.dpnd.recorder.String()
 	t.Log("backup results", result)

-	// as an offhand check: the result should contain the m365 user id
-	assert.Contains(t, result, suite.its.user.ID)
+	// As an offhand check: the result should contain the m365 user's email.
+	assert.Contains(
+		t,
+		strings.ToLower(result),
+		strings.ToLower(suite.m365.User.Provider.Name()))
 }

 // AWS flags
@@ -296,7 +306,7 @@ func (suite *BackupExchangeE2ESuite) TestBackupCreateExchange_badAWSFlags() {
 	cmd := cliTD.StubRootCmd(
 		"backup", "create", "exchange",
-		"--user", suite.its.user.ID,
+		"--user", suite.m365.User.ID,
 		"--aws-access-key", "invalid-value",
 		"--aws-secret-access-key", "some-invalid-value")
 	cli.BuildCommandTree(cmd)
@@ -319,7 +329,7 @@ type PreparedBackupExchangeE2ESuite struct {
 	tester.Suite
 	dpnd      dependencies
 	backupOps map[path.CategoryType]string
-	its       intgTesterSetup
+	m365      its.M365IntgTestSetup
 }

 func TestPreparedBackupExchangeE2ESuite(t *testing.T) {
@@ -336,13 +346,13 @@ func (suite *PreparedBackupExchangeE2ESuite) SetupSuite() {
 	ctx, flush := tester.NewContext(t)
 	defer flush()

-	suite.its = newIntegrationTesterSetup(t)
+	suite.m365 = its.GetM365(t)
 	suite.dpnd = prepM365Test(t, ctx, path.ExchangeService)
 	suite.backupOps = make(map[path.CategoryType]string)

 	var (
-		users = []string{suite.its.user.ID}
-		ins   = idname.NewCache(map[string]string{suite.its.user.ID: suite.its.user.ID})
+		users = []string{suite.m365.User.ID}
+		ins   = idname.NewCache(map[string]string{suite.m365.User.ID: suite.m365.User.ID})
 	)

 	for _, set := range []path.CategoryType{email, contacts, events} {

View File

@@ -35,9 +35,12 @@ const (
 	groupsServiceCommandCreateExamples = `# Backup all Groups and Teams data for the Marketing group
 corso backup create groups --group Marketing

-# Backup only Teams conversations messages
+# Backup only Teams channel messages
 corso backup create groups --group Marketing --data messages

+# Backup only group mailbox posts
+corso backup create groups --group Marketing --data conversations
+
 # Backup all Groups and Teams data for all groups
 corso backup create groups --group '*'`
@@ -50,7 +53,10 @@ corso backup details groups --backup 1234abcd-12ab-cd34-56de-1234abcd
 # Explore Marketing messages posted after the start of 2022
 corso backup details groups --backup 1234abcd-12ab-cd34-56de-1234abcd \
-	--last-message-reply-after 2022-01-01T00:00:00`
+	--last-message-reply-after 2022-01-01T00:00:00
+
+# Explore group mailbox posts with conversation subject "hello world"
+corso backup details groups --backup 1234abcd-12ab-cd34-56de-1234abcd --conversation "hello world"`
 )

 // called by backup.go to map subcommands to provider-specific handling.
@@ -310,7 +316,7 @@ func groupsBackupCreateSelectors(
 	group, cats []string,
 ) *selectors.GroupsBackup {
 	if filters.PathContains(group).Compare(flags.Wildcard) {
-		return includeAllGroupWithCategories(ins, cats)
+		return includeAllGroupsWithCategories(ins, cats)
 	}

 	sel := selectors.NewGroupsBackup(slices.Clone(group))
@@ -318,6 +324,6 @@ func groupsBackupCreateSelectors(
 	return utils.AddGroupsCategories(sel, cats)
 }

-func includeAllGroupWithCategories(ins idname.Cacher, categories []string) *selectors.GroupsBackup {
+func includeAllGroupsWithCategories(ins idname.Cacher, categories []string) *selectors.GroupsBackup {
 	return utils.AddGroupsCategories(selectors.NewGroupsBackup(ins.IDs()), categories)
 }

View File

@ -20,6 +20,7 @@ import (
"github.com/alcionai/corso/src/internal/common/idname" "github.com/alcionai/corso/src/internal/common/idname"
"github.com/alcionai/corso/src/internal/operations" "github.com/alcionai/corso/src/internal/operations"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/internal/tester/its"
"github.com/alcionai/corso/src/internal/tester/tconfig" "github.com/alcionai/corso/src/internal/tester/tconfig"
"github.com/alcionai/corso/src/pkg/config" "github.com/alcionai/corso/src/pkg/config"
"github.com/alcionai/corso/src/pkg/path" "github.com/alcionai/corso/src/pkg/path"
@ -35,7 +36,7 @@ import (
type NoBackupGroupsE2ESuite struct { type NoBackupGroupsE2ESuite struct {
tester.Suite tester.Suite
dpnd dependencies dpnd dependencies
its intgTesterSetup m365 its.M365IntgTestSetup
} }
func TestNoBackupGroupsE2ESuite(t *testing.T) { func TestNoBackupGroupsE2ESuite(t *testing.T) {
@ -50,7 +51,7 @@ func (suite *NoBackupGroupsE2ESuite) SetupSuite() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
suite.its = newIntegrationTesterSetup(t) suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.GroupsService) suite.dpnd = prepM365Test(t, ctx, path.GroupsService)
} }
@ -89,7 +90,7 @@ func (suite *NoBackupGroupsE2ESuite) TestGroupsBackupListCmd_noBackups() {
type BackupGroupsE2ESuite struct { type BackupGroupsE2ESuite struct {
tester.Suite tester.Suite
dpnd dependencies dpnd dependencies
its intgTesterSetup m365 its.M365IntgTestSetup
} }
func TestBackupGroupsE2ESuite(t *testing.T) { func TestBackupGroupsE2ESuite(t *testing.T) {
@ -104,7 +105,7 @@ func (suite *BackupGroupsE2ESuite) SetupSuite() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
suite.its = newIntegrationTesterSetup(t) suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.GroupsService) suite.dpnd = prepM365Test(t, ctx, path.GroupsService)
} }
@ -113,6 +114,8 @@ func (suite *BackupGroupsE2ESuite) TestGroupsBackupCmd_channelMessages() {
} }
func (suite *BackupGroupsE2ESuite) TestGroupsBackupCmd_conversations() { func (suite *BackupGroupsE2ESuite) TestGroupsBackupCmd_conversations() {
// skip
suite.T().Skip("CorsoCITeam group mailbox backup is broken")
runGroupsBackupCategoryTest(suite, flags.DataConversations) runGroupsBackupCategoryTest(suite, flags.DataConversations)
} }
@ -134,7 +137,7 @@ func runGroupsBackupCategoryTest(suite *BackupGroupsE2ESuite, category string) {
cmd, ctx := buildGroupsBackupCmd( cmd, ctx := buildGroupsBackupCmd(
ctx, ctx,
suite.dpnd.configFilePath, suite.dpnd.configFilePath,
suite.its.group.ID, suite.m365.Group.ID,
category, category,
&recorder) &recorder)
@ -202,7 +205,7 @@ func (suite *BackupGroupsE2ESuite) TestBackupCreateGroups_badAzureClientIDFlag()
cmd := cliTD.StubRootCmd( cmd := cliTD.StubRootCmd(
"backup", "create", "groups", "backup", "create", "groups",
"--group", suite.its.group.ID, "--group", suite.m365.Group.ID,
"--azure-client-id", "invalid-value") "--azure-client-id", "invalid-value")
cli.BuildCommandTree(cmd) cli.BuildCommandTree(cmd)
@ -216,6 +219,9 @@ func (suite *BackupGroupsE2ESuite) TestBackupCreateGroups_badAzureClientIDFlag()
} }
func (suite *BackupGroupsE2ESuite) TestBackupCreateGroups_fromConfigFile() { func (suite *BackupGroupsE2ESuite) TestBackupCreateGroups_fromConfigFile() {
// Skip
suite.T().Skip("CorsoCITeam group mailbox backup is broken")
t := suite.T() t := suite.T()
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr) ctx = config.SetViper(ctx, suite.dpnd.vpr)
@ -226,7 +232,7 @@ func (suite *BackupGroupsE2ESuite) TestBackupCreateGroups_fromConfigFile() {
cmd := cliTD.StubRootCmd( cmd := cliTD.StubRootCmd(
"backup", "create", "groups", "backup", "create", "groups",
"--group", suite.its.group.ID, "--group", suite.m365.Group.ID,
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath) "--"+flags.ConfigFileFN, suite.dpnd.configFilePath)
cli.BuildCommandTree(cmd) cli.BuildCommandTree(cmd)
@ -250,7 +256,7 @@ func (suite *BackupGroupsE2ESuite) TestBackupCreateGroups_badAWSFlags() {
cmd := cliTD.StubRootCmd( cmd := cliTD.StubRootCmd(
"backup", "create", "groups", "backup", "create", "groups",
"--group", suite.its.group.ID, "--group", suite.m365.Group.ID,
"--aws-access-key", "invalid-value", "--aws-access-key", "invalid-value",
"--aws-secret-access-key", "some-invalid-value") "--aws-secret-access-key", "some-invalid-value")
cli.BuildCommandTree(cmd) cli.BuildCommandTree(cmd)
@ -273,7 +279,7 @@ type PreparedBackupGroupsE2ESuite struct {
tester.Suite tester.Suite
dpnd dependencies dpnd dependencies
backupOps map[path.CategoryType]string backupOps map[path.CategoryType]string
its intgTesterSetup m365 its.M365IntgTestSetup
} }
func TestPreparedBackupGroupsE2ESuite(t *testing.T) { func TestPreparedBackupGroupsE2ESuite(t *testing.T) {
@ -290,16 +296,19 @@ func (suite *PreparedBackupGroupsE2ESuite) SetupSuite() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
suite.its = newIntegrationTesterSetup(t) suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.GroupsService) suite.dpnd = prepM365Test(t, ctx, path.GroupsService)
suite.backupOps = make(map[path.CategoryType]string) suite.backupOps = make(map[path.CategoryType]string)
var ( var (
groups = []string{suite.its.group.ID} groups = []string{suite.m365.Group.ID}
ins = idname.NewCache(map[string]string{suite.its.group.ID: suite.its.group.ID}) ins = idname.NewCache(map[string]string{suite.m365.Group.ID: suite.m365.Group.ID})
cats = []path.CategoryType{ cats = []path.CategoryType{
path.ChannelMessagesCategory, path.ChannelMessagesCategory,
path.ConversationPostsCategory, // TODO(pandeyabs): CorsoCITeam group mailbox backup is currently broken because of invalid
// odata.NextLink which causes an infinite loop during paging. Disabling conversations tests while
// we go fix the group mailbox.
// path.ConversationPostsCategory,
path.LibrariesCategory, path.LibrariesCategory,
} }
) )
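The TODO above pins the breakage on an invalid odata.nextLink that traps the pager in an infinite loop. As a general defensive pattern (not Corso's actual pager), a paging loop can remember every nextLink it has followed and fail loudly when the service hands back one it has already seen. A minimal sketch, where fetchPage, its page type, and handleItems are hypothetical:
func drainPages(ctx context.Context, firstLink string) error {
	// track every nextLink already followed; a repeat means the service
	// is looping us, so fail instead of spinning forever.
	seen := map[string]struct{}{}
	for link := firstLink; link != ""; {
		if _, ok := seen[link]; ok {
			return clues.New("repeated odata.nextLink; aborting paging").With("next_link", link)
		}
		seen[link] = struct{}{}
		page, err := fetchPage(ctx, link) // hypothetical fetch helper
		if err != nil {
			return err
		}
		handleItems(page.Items) // hypothetical consumer
		link = page.NextLink
	}
	return nil
}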
@ -453,6 +462,8 @@ func (suite *PreparedBackupGroupsE2ESuite) TestGroupsDetailsCmd_channelMessages(
} }
func (suite *PreparedBackupGroupsE2ESuite) TestGroupsDetailsCmd_conversations() { func (suite *PreparedBackupGroupsE2ESuite) TestGroupsDetailsCmd_conversations() {
// skip
suite.T().Skip("CorsoCITeam group mailbox backup is broken")
runGroupsDetailsCmdTest(suite, path.ConversationPostsCategory) runGroupsDetailsCmdTest(suite, path.ConversationPostsCategory)
} }

View File

@ -14,141 +14,16 @@ import (
"github.com/alcionai/corso/src/cli/flags" "github.com/alcionai/corso/src/cli/flags"
"github.com/alcionai/corso/src/cli/print" "github.com/alcionai/corso/src/cli/print"
cliTD "github.com/alcionai/corso/src/cli/testdata" cliTD "github.com/alcionai/corso/src/cli/testdata"
"github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/internal/tester/tconfig" "github.com/alcionai/corso/src/internal/tester/tconfig"
"github.com/alcionai/corso/src/pkg/account" "github.com/alcionai/corso/src/pkg/account"
"github.com/alcionai/corso/src/pkg/config" "github.com/alcionai/corso/src/pkg/config"
"github.com/alcionai/corso/src/pkg/control" "github.com/alcionai/corso/src/pkg/control"
"github.com/alcionai/corso/src/pkg/count"
"github.com/alcionai/corso/src/pkg/path" "github.com/alcionai/corso/src/pkg/path"
"github.com/alcionai/corso/src/pkg/repository" "github.com/alcionai/corso/src/pkg/repository"
"github.com/alcionai/corso/src/pkg/services/m365/api"
"github.com/alcionai/corso/src/pkg/services/m365/api/graph"
"github.com/alcionai/corso/src/pkg/storage" "github.com/alcionai/corso/src/pkg/storage"
"github.com/alcionai/corso/src/pkg/storage/testdata" "github.com/alcionai/corso/src/pkg/storage/testdata"
) )
// ---------------------------------------------------------------------------
// Gockable client
// ---------------------------------------------------------------------------
// GockClient produces a new exchange api client that can be
// mocked using gock.
func gockClient(creds account.M365Config, counter *count.Bus) (api.Client, error) {
s, err := graph.NewGockService(creds, counter)
if err != nil {
return api.Client{}, err
}
li, err := graph.NewGockService(creds, counter, graph.NoTimeout())
if err != nil {
return api.Client{}, err
}
return api.Client{
Credentials: creds,
Stable: s,
LargeItem: li,
}, nil
}
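Before this helper moves out of the file, its intended use is worth a sketch: a test arms gock to intercept Graph traffic, then exercises the client as normal. gock.New, Reply, JSON, and Off are the real gock API; the endpoint and response body here are invented:
defer gock.Off()
gock.New("https://graph.microsoft.com").
	Get("/v1.0/users").
	Reply(200).
	JSON(map[string]any{"value": []any{}})
ac, err := gockClient(creds, count.New())
require.NoError(t, err, clues.ToCore(err))
// calls through ac now resolve against the gock intercept
// instead of the live Graph endpoint.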
// ---------------------------------------------------------------------------
// Suite Setup
// ---------------------------------------------------------------------------
type ids struct {
ID string
DriveID string
DriveRootFolderID string
}
type intgTesterSetup struct {
acct account.Account
ac api.Client
gockAC api.Client
user ids
site ids
group ids
team ids
}
func newIntegrationTesterSetup(t *testing.T) intgTesterSetup {
its := intgTesterSetup{}
ctx, flush := tester.NewContext(t)
defer flush()
graph.InitializeConcurrencyLimiter(ctx, true, 4)
its.acct = tconfig.NewM365Account(t)
creds, err := its.acct.M365Config()
require.NoError(t, err, clues.ToCore(err))
its.ac, err = api.NewClient(
creds,
control.DefaultOptions(),
count.New())
require.NoError(t, err, clues.ToCore(err))
its.gockAC, err = gockClient(creds, count.New())
require.NoError(t, err, clues.ToCore(err))
// user drive
uids := ids{}
uids.ID = tconfig.M365UserID(t)
userDrive, err := its.ac.Users().GetDefaultDrive(ctx, uids.ID)
require.NoError(t, err, clues.ToCore(err))
uids.DriveID = ptr.Val(userDrive.GetId())
userDriveRootFolder, err := its.ac.Drives().GetRootFolder(ctx, uids.DriveID)
require.NoError(t, err, clues.ToCore(err))
uids.DriveRootFolderID = ptr.Val(userDriveRootFolder.GetId())
its.user = uids
// site
sids := ids{}
sids.ID = tconfig.M365SiteID(t)
siteDrive, err := its.ac.Sites().GetDefaultDrive(ctx, sids.ID)
require.NoError(t, err, clues.ToCore(err))
sids.DriveID = ptr.Val(siteDrive.GetId())
siteDriveRootFolder, err := its.ac.Drives().GetRootFolder(ctx, sids.DriveID)
require.NoError(t, err, clues.ToCore(err))
sids.DriveRootFolderID = ptr.Val(siteDriveRootFolder.GetId())
its.site = sids
// group
gids := ids{}
// use of the TeamID is intentional here, so that we are assured
// the group has full usage of the teams api.
gids.ID = tconfig.M365TeamID(t)
its.group = gids
// team
tids := ids{}
tids.ID = tconfig.M365TeamID(t)
its.team = tids
return its
}
type dependencies struct { type dependencies struct {
st storage.Storage st storage.Storage
repo repository.Repositoryer repo repository.Repositoryer
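The removed newIntegrationTesterSetup above is what the its.GetM365 calls throughout this diff replace. A minimal sketch of the consuming pattern, inferred from those call sites (the User/Site/Group fields and their ID members appear in the diff; the full shape of its.M365IntgTestSetup, and the suite name here, are assumptions):
func (suite *ExampleM365E2ESuite) SetupSuite() {
	t := suite.T()
	ctx, flush := tester.NewContext(t)
	defer flush()
	// one shared, cached setup replaces the per-package drive lookups above
	suite.m365 = its.GetM365(t)
	suite.dpnd = prepM365Test(t, ctx, path.ExchangeService)
	// resolved test resources hang off typed fields
	_ = suite.m365.User.ID
	_ = suite.m365.Site.ID
	_ = suite.m365.Group.ID
}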

View File

@ -37,7 +37,11 @@ corso backup create sharepoint --site https://example.com/hr
corso backup create sharepoint --site https://example.com/hr,https://example.com/team corso backup create sharepoint --site https://example.com/hr,https://example.com/team
# Backup all SharePoint data for all Sites # Backup all SharePoint data for all Sites
corso backup create sharepoint --site '*'` corso backup create sharepoint --site '*'
# Backup all SharePoint list data for a Site
corso backup create sharepoint --site https://example.com/hr --data lists
`
sharePointServiceCommandDeleteExamples = `# Delete SharePoint backup with ID 1234abcd-12ab-cd34-56de-1234abcd \ sharePointServiceCommandDeleteExamples = `# Delete SharePoint backup with ID 1234abcd-12ab-cd34-56de-1234abcd \
and 1234abcd-12ab-cd34-56de-1234abce and 1234abcd-12ab-cd34-56de-1234abce
@ -57,7 +61,26 @@ corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
# Explore all files within the document library "Work Documents" # Explore all files within the document library "Work Documents"
corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \ corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--library "Work Documents" --library "Work Documents"
`
# Explore lists by their name(s)
corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list "list-name-1,list-name-2"
# Explore lists created after a given time
corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-created-after 2024-01-01T12:23:34
# Explore lists created before a given time
corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-created-before 2024-01-01T12:23:34
# Explore lists modified before a given time
corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-modified-before 2024-01-01T12:23:34
# Explore lists modified after a given time
corso backup details sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-modified-after 2024-01-01T12:23:34`
) )
// called by backup.go to map subcommands to provider-specific handling. // called by backup.go to map subcommands to provider-specific handling.
@ -73,6 +96,8 @@ func addSharePointCommands(cmd *cobra.Command) *cobra.Command {
flags.AddSiteFlag(c, true) flags.AddSiteFlag(c, true)
flags.AddSiteIDFlag(c, true) flags.AddSiteIDFlag(c, true)
// TODO(hitesh): add the lists flag to the default backup data set
// once explicit invocation for lists is no longer required
flags.AddDataFlag(c, []string{flags.DataLibraries}, true) flags.AddDataFlag(c, []string{flags.DataLibraries}, true)
flags.AddGenericBackupFlags(c) flags.AddGenericBackupFlags(c)
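A possible follow-up shape for the TODO above once lists no longer require explicit invocation; flags.DataLists is an assumed constant mirroring the DataLibraries one used here:
// hypothetical: lists join the default sharepoint backup data set
flags.AddDataFlag(c, []string{flags.DataLibraries, flags.DataLists}, true)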

View File

@ -20,6 +20,7 @@ import (
"github.com/alcionai/corso/src/internal/common/idname" "github.com/alcionai/corso/src/internal/common/idname"
"github.com/alcionai/corso/src/internal/operations" "github.com/alcionai/corso/src/internal/operations"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/internal/tester/its"
"github.com/alcionai/corso/src/internal/tester/tconfig" "github.com/alcionai/corso/src/internal/tester/tconfig"
"github.com/alcionai/corso/src/pkg/backup/details" "github.com/alcionai/corso/src/pkg/backup/details"
"github.com/alcionai/corso/src/pkg/config" "github.com/alcionai/corso/src/pkg/config"
@ -89,7 +90,7 @@ func (suite *NoBackupSharePointE2ESuite) TestSharePointBackupListCmd_empty() {
type BackupSharepointE2ESuite struct { type BackupSharepointE2ESuite struct {
tester.Suite tester.Suite
dpnd dependencies dpnd dependencies
its intgTesterSetup m365 its.M365IntgTestSetup
} }
func TestBackupSharepointE2ESuite(t *testing.T) { func TestBackupSharepointE2ESuite(t *testing.T) {
@ -104,7 +105,7 @@ func (suite *BackupSharepointE2ESuite) SetupSuite() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
suite.its = newIntegrationTesterSetup(t) suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.SharePointService) suite.dpnd = prepM365Test(t, ctx, path.SharePointService)
} }
@ -128,7 +129,7 @@ func runSharepointBackupCategoryTest(suite *BackupSharepointE2ESuite, category s
cmd, ctx := buildSharepointBackupCmd( cmd, ctx := buildSharepointBackupCmd(
ctx, ctx,
suite.dpnd.configFilePath, suite.dpnd.configFilePath,
suite.its.site.ID, suite.m365.Site.ID,
category, category,
&recorder) &recorder)
@ -187,7 +188,7 @@ type PreparedBackupSharepointE2ESuite struct {
tester.Suite tester.Suite
dpnd dependencies dpnd dependencies
backupOps map[path.CategoryType]string backupOps map[path.CategoryType]string
its intgTesterSetup m365 its.M365IntgTestSetup
} }
func TestPreparedBackupSharepointE2ESuite(t *testing.T) { func TestPreparedBackupSharepointE2ESuite(t *testing.T) {
@ -204,13 +205,13 @@ func (suite *PreparedBackupSharepointE2ESuite) SetupSuite() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
suite.its = newIntegrationTesterSetup(t) suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.SharePointService) suite.dpnd = prepM365Test(t, ctx, path.SharePointService)
suite.backupOps = make(map[path.CategoryType]string) suite.backupOps = make(map[path.CategoryType]string)
var ( var (
sites = []string{suite.its.site.ID} sites = []string{suite.m365.Site.ID}
ins = idname.NewCache(map[string]string{suite.its.site.ID: suite.its.site.ID}) ins = idname.NewCache(map[string]string{suite.m365.Site.ID: suite.m365.Site.ID})
cats = []path.CategoryType{ cats = []path.CategoryType{
path.ListsCategory, path.ListsCategory,
} }

View File

@ -0,0 +1,305 @@
package backup
import (
"context"
"fmt"
"github.com/alcionai/clues"
"github.com/spf13/cobra"
"golang.org/x/exp/slices"
"github.com/alcionai/corso/src/cli/flags"
. "github.com/alcionai/corso/src/cli/print"
"github.com/alcionai/corso/src/cli/utils"
"github.com/alcionai/corso/src/internal/common/idname"
"github.com/alcionai/corso/src/pkg/fault"
"github.com/alcionai/corso/src/pkg/filters"
"github.com/alcionai/corso/src/pkg/path"
"github.com/alcionai/corso/src/pkg/selectors"
"github.com/alcionai/corso/src/pkg/services/m365"
)
// ------------------------------------------------------------------------------------------------
// setup and globals
// ------------------------------------------------------------------------------------------------
const (
teamschatsServiceCommand = "chats"
teamschatsServiceCommandCreateUseSuffix = "--user <userEmail> | '" + flags.Wildcard + "'"
teamschatsServiceCommandDeleteUseSuffix = "--backups <backupId>"
teamschatsServiceCommandDetailsUseSuffix = "--backup <backupId>"
)
const (
teamschatsServiceCommandCreateExamples = `# Backup all chats with bob@company.hr
corso backup create chats --user bob@company.hr
# Backup all chats for all users
corso backup create chats --user '*'`
teamschatsServiceCommandDeleteExamples = `# Delete chats backups with IDs 1234abcd-12ab-cd34-56de-1234abcd \
and 1234abcd-12ab-cd34-56de-1234abce
corso backup delete chats --backups 1234abcd-12ab-cd34-56de-1234abcd,1234abcd-12ab-cd34-56de-1234abce`
teamschatsServiceCommandDetailsExamples = `# Explore chats in Bob's latest backup (1234abcd...)
corso backup details chats --backup 1234abcd-12ab-cd34-56de-1234abcd`
)
// called by backup.go to map subcommands to provider-specific handling.
func addTeamsChatsCommands(cmd *cobra.Command) *cobra.Command {
var c *cobra.Command
switch cmd.Use {
case createCommand:
c, _ = utils.AddCommand(cmd, teamschatsCreateCmd(), utils.MarkPreReleaseCommand())
c.Use = c.Use + " " + teamschatsServiceCommandCreateUseSuffix
c.Example = teamschatsServiceCommandCreateExamples
// Flags addition ordering should follow the order we want them to appear in help and docs:
flags.AddUserFlag(c)
flags.AddDataFlag(c, []string{flags.DataChats}, false)
flags.AddGenericBackupFlags(c)
case listCommand:
c, _ = utils.AddCommand(cmd, teamschatsListCmd(), utils.MarkPreReleaseCommand())
flags.AddBackupIDFlag(c, false)
flags.AddAllBackupListFlags(c)
case detailsCommand:
c, _ = utils.AddCommand(cmd, teamschatsDetailsCmd(), utils.MarkPreReleaseCommand())
c.Use = c.Use + " " + teamschatsServiceCommandDetailsUseSuffix
c.Example = teamschatsServiceCommandDetailsExamples
flags.AddSkipReduceFlag(c)
// Flags addition ordering should follow the order we want them to appear in help and docs:
// More generic (ex: --user) and more frequently used flags take precedence.
flags.AddBackupIDFlag(c, true)
flags.AddTeamsChatsDetailsAndRestoreFlags(c)
case deleteCommand:
c, _ = utils.AddCommand(cmd, teamschatsDeleteCmd(), utils.MarkPreReleaseCommand())
c.Use = c.Use + " " + teamschatsServiceCommandDeleteUseSuffix
c.Example = teamschatsServiceCommandDeleteExamples
flags.AddMultipleBackupIDsFlag(c, false)
flags.AddBackupIDFlag(c, false)
}
return c
}
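Per the comment at the top of this file, backup.go hands each parent command through this function; roughly the following wiring, with the parent command variables invented for illustration:
// each parent's Use field selects the matching case in the switch above
for _, parent := range []*cobra.Command{createCmd, listCmd, detailsCmd, deleteCmd} {
	addTeamsChatsCommands(parent)
}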
// ------------------------------------------------------------------------------------------------
// backup create
// ------------------------------------------------------------------------------------------------
// `corso backup create chats [<flag>...]`
func teamschatsCreateCmd() *cobra.Command {
return &cobra.Command{
Use: teamschatsServiceCommand,
Aliases: []string{teamsServiceCommand},
Short: "Backup M365 Chats data",
RunE: createTeamsChatsCmd,
Args: cobra.NoArgs,
}
}
// processes a teamschats backup.
func createTeamsChatsCmd(cmd *cobra.Command, args []string) error {
ctx := cmd.Context()
if utils.HasNoFlagsAndShownHelp(cmd) {
return nil
}
if flags.RunModeFV == flags.RunModeFlagTest {
return nil
}
if err := validateTeamsChatsBackupCreateFlags(flags.UserFV, flags.CategoryDataFV); err != nil {
return err
}
r, acct, err := utils.AccountConnectAndWriteRepoConfig(
ctx,
cmd,
path.TeamsChatsService)
if err != nil {
return Only(ctx, err)
}
defer utils.CloseRepo(ctx, r)
// TODO: log/print recoverable errors
errs := fault.New(false)
svcCli, err := m365.NewM365Client(ctx, *acct)
if err != nil {
return Only(ctx, clues.Stack(err))
}
ins, err := svcCli.AC.Users().GetAllIDsAndNames(ctx, errs)
if err != nil {
return Only(ctx, clues.Wrap(err, "Failed to retrieve M365 users"))
}
sel := teamschatsBackupCreateSelectors(ctx, ins, flags.UserFV, flags.CategoryDataFV)
selectorSet := []selectors.Selector{}
for _, discSel := range sel.SplitByResourceOwner(ins.IDs()) {
selectorSet = append(selectorSet, discSel.Selector)
}
return genericCreateCommand(
ctx,
r,
"Chats",
selectorSet,
ins)
}
// ------------------------------------------------------------------------------------------------
// backup list
// ------------------------------------------------------------------------------------------------
// `corso backup list teamschats [<flag>...]`
func teamschatsListCmd() *cobra.Command {
return &cobra.Command{
Use: teamschatsServiceCommand,
Short: "List the history of M365 Chats backups",
RunE: listTeamsChatsCmd,
Args: cobra.NoArgs,
}
}
// lists the history of backup operations
func listTeamsChatsCmd(cmd *cobra.Command, args []string) error {
return genericListCommand(cmd, flags.BackupIDFV, path.TeamsChatsService, args)
}
// ------------------------------------------------------------------------------------------------
// backup details
// ------------------------------------------------------------------------------------------------
// `corso backup details teamschats [<flag>...]`
func teamschatsDetailsCmd() *cobra.Command {
return &cobra.Command{
Use: teamschatsServiceCommand,
Short: "Shows the details of an M365 Chats backup",
RunE: detailsTeamsChatsCmd,
Args: cobra.NoArgs,
}
}
// displays the details of a teamschats backup.
func detailsTeamsChatsCmd(cmd *cobra.Command, args []string) error {
if utils.HasNoFlagsAndShownHelp(cmd) {
return nil
}
if flags.RunModeFV == flags.RunModeFlagTest {
return nil
}
return runDetailsTeamsChatsCmd(cmd)
}
func runDetailsTeamsChatsCmd(cmd *cobra.Command) error {
ctx := cmd.Context()
opts := utils.MakeTeamsChatsOpts(cmd)
sel := utils.IncludeTeamsChatsRestoreDataSelectors(ctx, opts)
sel.Configure(selectors.Config{OnlyMatchItemNames: true})
utils.FilterTeamsChatsRestoreInfoSelectors(sel, opts)
ds, err := genericDetailsCommand(cmd, flags.BackupIDFV, sel.Selector)
if err != nil {
return Only(ctx, err)
}
if len(ds.Entries) > 0 {
ds.PrintEntries(ctx)
} else {
Info(ctx, selectors.ErrorNoMatchingItems)
}
return nil
}
// ------------------------------------------------------------------------------------------------
// backup delete
// ------------------------------------------------------------------------------------------------
// `corso backup delete teamschats [<flag>...]`
func teamschatsDeleteCmd() *cobra.Command {
return &cobra.Command{
Use: teamschatsServiceCommand,
Short: "Delete backed-up M365 Chats data",
RunE: deleteTeamsChatsCmd,
Args: cobra.NoArgs,
}
}
// deletes a teamschats backup.
func deleteTeamsChatsCmd(cmd *cobra.Command, args []string) error {
backupIDValue := []string{}
if len(flags.BackupIDsFV) > 0 {
backupIDValue = flags.BackupIDsFV
} else if len(flags.BackupIDFV) > 0 {
backupIDValue = append(backupIDValue, flags.BackupIDFV)
} else {
return clues.New("either --backup or --backups flag is required")
}
return genericDeleteCommand(cmd, path.TeamsChatsService, "TeamsChats", backupIDValue, args)
}
// ---------------------------------------------------------------------------
// helpers
// ---------------------------------------------------------------------------
func validateTeamsChatsBackupCreateFlags(teamschats, cats []string) error {
if len(teamschats) == 0 {
return clues.New(
"requires one or more --" +
flags.UserFN + " ids, or the wildcard --" +
flags.UserFN + " *")
}
msg := fmt.Sprintf(
" is an unrecognized data type; only %s is supported",
flags.DataChats)
allowedCats := utils.TeamsChatsAllowedCategories()
for _, d := range cats {
if _, ok := allowedCats[d]; !ok {
return clues.New(d + msg)
}
}
return nil
}
func teamschatsBackupCreateSelectors(
ctx context.Context,
ins idname.Cacher,
users, cats []string,
) *selectors.TeamsChatsBackup {
if filters.PathContains(users).Compare(flags.Wildcard) {
return includeAllTeamsChatsWithCategories(ins, cats)
}
sel := selectors.NewTeamsChatsBackup(slices.Clone(users))
return utils.AddTeamsChatsCategories(sel, cats)
}
func includeAllTeamsChatsWithCategories(ins idname.Cacher, categories []string) *selectors.TeamsChatsBackup {
return utils.AddTeamsChatsCategories(selectors.NewTeamsChatsBackup(ins.IDs()), categories)
}
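To illustrate the two branches above (the helpers, idname.NewCache, and flags.Wildcard are from this diff; the IDs and a ctx in scope are assumed):
ins := idname.NewCache(map[string]string{
	"user-id-1": "alice@company.hr",
	"user-id-2": "bob@company.hr",
})
// explicit users: the selector scopes to exactly those owners
one := teamschatsBackupCreateSelectors(ctx, ins, []string{"user-id-1"}, []string{flags.DataChats})
// wildcard: every ID known to the cache gets included
all := teamschatsBackupCreateSelectors(ctx, ins, []string{flags.Wildcard}, []string{flags.DataChats})
_, _ = one, all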

View File

@ -0,0 +1,636 @@
package backup_test
import (
"context"
"fmt"
"strings"
"testing"
"github.com/alcionai/clues"
"github.com/google/uuid"
"github.com/spf13/cobra"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/stretchr/testify/suite"
"github.com/alcionai/corso/src/cli"
"github.com/alcionai/corso/src/cli/flags"
"github.com/alcionai/corso/src/cli/print"
cliTD "github.com/alcionai/corso/src/cli/testdata"
"github.com/alcionai/corso/src/internal/common/idname"
"github.com/alcionai/corso/src/internal/operations"
"github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/internal/tester/its"
"github.com/alcionai/corso/src/internal/tester/tconfig"
"github.com/alcionai/corso/src/pkg/config"
"github.com/alcionai/corso/src/pkg/path"
"github.com/alcionai/corso/src/pkg/selectors"
selTD "github.com/alcionai/corso/src/pkg/selectors/testdata"
storeTD "github.com/alcionai/corso/src/pkg/storage/testdata"
)
// ---------------------------------------------------------------------------
// tests that require no existing backups
// ---------------------------------------------------------------------------
type NoBackupTeamsChatsE2ESuite struct {
tester.Suite
dpnd dependencies
m365 its.M365IntgTestSetup
}
func TestNoBackupTeamsChatsE2ESuite(t *testing.T) {
suite.Run(t, &NoBackupTeamsChatsE2ESuite{Suite: tester.NewE2ESuite(
t,
[][]string{storeTD.AWSStorageCredEnvs, tconfig.M365AcctCredEnvs})})
}
func (suite *NoBackupTeamsChatsE2ESuite) SetupSuite() {
t := suite.T()
t.Skip("not fully implemented")
ctx, flush := tester.NewContext(t)
defer flush()
suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.TeamsChatsService)
}
func (suite *NoBackupTeamsChatsE2ESuite) TestTeamsChatsBackupListCmd_noBackups() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
suite.dpnd.recorder.Reset()
cmd := cliTD.StubRootCmd(
"backup", "list", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath)
cli.BuildCommandTree(cmd)
cmd.SetErr(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
result := suite.dpnd.recorder.String()
// the output should report that no backups exist
assert.True(t, strings.HasSuffix(result, "No backups available\n"))
}
// ---------------------------------------------------------------------------
// tests with no prior backup
// ---------------------------------------------------------------------------
type BackupTeamsChatsE2ESuite struct {
tester.Suite
dpnd dependencies
m365 its.M365IntgTestSetup
}
func TestBackupTeamsChatsE2ESuite(t *testing.T) {
suite.Run(t, &BackupTeamsChatsE2ESuite{Suite: tester.NewE2ESuite(
t,
[][]string{storeTD.AWSStorageCredEnvs, tconfig.M365AcctCredEnvs})})
}
func (suite *BackupTeamsChatsE2ESuite) SetupSuite() {
t := suite.T()
t.Skip("not fully implemented")
ctx, flush := tester.NewContext(t)
defer flush()
suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.TeamsChatsService)
}
func (suite *BackupTeamsChatsE2ESuite) TestTeamsChatsBackupCmd_chats() {
runTeamsChatsBackupCategoryTest(suite, flags.DataChats)
}
func runTeamsChatsBackupCategoryTest(suite *BackupTeamsChatsE2ESuite, category string) {
recorder := strings.Builder{}
recorder.Reset()
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd, ctx := buildTeamsChatsBackupCmd(
ctx,
suite.dpnd.configFilePath,
suite.m365.User.ID,
category,
&recorder)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
result := recorder.String()
t.Log("backup results", result)
}
func (suite *BackupTeamsChatsE2ESuite) TestTeamsChatsBackupCmd_teamschatNotFound_chats() {
runTeamsChatsBackupTeamsChatNotFoundTest(suite, flags.DataChats)
}
func runTeamsChatsBackupTeamsChatNotFoundTest(suite *BackupTeamsChatsE2ESuite, category string) {
recorder := strings.Builder{}
recorder.Reset()
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd, ctx := buildTeamsChatsBackupCmd(
ctx,
suite.dpnd.configFilePath,
"foo@not-there.com",
category,
&recorder)
// run the command
err := cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
assert.Contains(
t,
err.Error(),
"not found",
"error missing user not found")
assert.NotContains(t, err.Error(), "runtime error", "panic happened")
t.Logf("backup error message: %s", err.Error())
result := recorder.String()
t.Log("backup results", result)
}
func (suite *BackupTeamsChatsE2ESuite) TestBackupCreateTeamsChats_badAzureClientIDFlag() {
t := suite.T()
ctx, flush := tester.NewContext(t)
defer flush()
suite.dpnd.recorder.Reset()
cmd := cliTD.StubRootCmd(
"backup", "create", "chats",
"--teamschat", suite.m365.User.ID,
"--azure-client-id", "invalid-value")
cli.BuildCommandTree(cmd)
cmd.SetErr(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
}
func (suite *BackupTeamsChatsE2ESuite) TestBackupCreateTeamsChats_fromConfigFile() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
suite.dpnd.recorder.Reset()
cmd := cliTD.StubRootCmd(
"backup", "create", "chats",
"--teamschat", suite.m365.User.ID,
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath)
cli.BuildCommandTree(cmd)
cmd.SetOut(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
}
// AWS flags
func (suite *BackupTeamsChatsE2ESuite) TestBackupCreateTeamsChats_badAWSFlags() {
t := suite.T()
ctx, flush := tester.NewContext(t)
defer flush()
suite.dpnd.recorder.Reset()
cmd := cliTD.StubRootCmd(
"backup", "create", "chats",
"--teamschat", suite.m365.User.ID,
"--aws-access-key", "invalid-value",
"--aws-secret-access-key", "some-invalid-value")
cli.BuildCommandTree(cmd)
cmd.SetOut(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
// since invalid aws creds are explicitly set, should see a failure
require.Error(t, err, clues.ToCore(err))
}
// ---------------------------------------------------------------------------
// tests prepared with a previous backup
// ---------------------------------------------------------------------------
type PreparedBackupTeamsChatsE2ESuite struct {
tester.Suite
dpnd dependencies
backupOps map[path.CategoryType]string
m365 its.M365IntgTestSetup
}
func TestPreparedBackupTeamsChatsE2ESuite(t *testing.T) {
suite.Run(t, &PreparedBackupTeamsChatsE2ESuite{
Suite: tester.NewE2ESuite(
t,
[][]string{storeTD.AWSStorageCredEnvs, tconfig.M365AcctCredEnvs}),
})
}
func (suite *PreparedBackupTeamsChatsE2ESuite) SetupSuite() {
t := suite.T()
t.Skip("not fully implemented")
ctx, flush := tester.NewContext(t)
defer flush()
suite.m365 = its.GetM365(t)
suite.dpnd = prepM365Test(t, ctx, path.TeamsChatsService)
suite.backupOps = make(map[path.CategoryType]string)
var (
teamschats = []string{suite.m365.User.ID}
ins = idname.NewCache(map[string]string{suite.m365.User.ID: suite.m365.User.ID})
cats = []path.CategoryType{
path.ChatsCategory,
}
)
for _, set := range cats {
var (
sel = selectors.NewTeamsChatsBackup(teamschats)
scopes []selectors.TeamsChatsScope
)
switch set {
case path.ChatsCategory:
scopes = selTD.TeamsChatsBackupChatScope(sel)
}
sel.Include(scopes)
bop, err := suite.dpnd.repo.NewBackupWithLookup(ctx, sel.Selector, ins)
require.NoError(t, err, clues.ToCore(err))
err = bop.Run(ctx)
require.NoError(t, err, clues.ToCore(err))
bIDs := string(bop.Results.BackupID)
// sanity check, ensure we can find the backup and its details immediately
b, err := suite.dpnd.repo.Backup(ctx, string(bop.Results.BackupID))
require.NoError(t, err, "retrieving recent backup by ID")
require.Equal(t, bIDs, string(b.ID), "repo backup matches results id")
_, b, errs := suite.dpnd.repo.GetBackupDetails(ctx, bIDs)
require.NoError(t, errs.Failure(), "retrieving recent backup details by ID")
require.Empty(t, errs.Recovered(), "retrieving recent backup details by ID")
require.Equal(t, bIDs, string(b.ID), "repo details matches results id")
suite.backupOps[set] = string(b.ID)
}
}
func (suite *PreparedBackupTeamsChatsE2ESuite) TestTeamsChatsListCmd_chats() {
runTeamsChatsListCmdTest(suite, path.ChatsCategory)
}
func runTeamsChatsListCmdTest(suite *PreparedBackupTeamsChatsE2ESuite, category path.CategoryType) {
suite.dpnd.recorder.Reset()
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd := cliTD.StubRootCmd(
"backup", "list", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath)
cli.BuildCommandTree(cmd)
cmd.SetOut(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
// compare the output
result := suite.dpnd.recorder.String()
assert.Contains(t, result, suite.backupOps[category])
}
func (suite *PreparedBackupTeamsChatsE2ESuite) TestTeamsChatsListCmd_singleID_chats() {
runTeamsChatsListSingleCmdTest(suite, path.ChatsCategory)
}
func runTeamsChatsListSingleCmdTest(suite *PreparedBackupTeamsChatsE2ESuite, category path.CategoryType) {
suite.dpnd.recorder.Reset()
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
bID := suite.backupOps[category]
cmd := cliTD.StubRootCmd(
"backup", "list", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--backup", string(bID))
cli.BuildCommandTree(cmd)
cmd.SetOut(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
// compare the output
result := suite.dpnd.recorder.String()
assert.Contains(t, result, bID)
}
func (suite *PreparedBackupTeamsChatsE2ESuite) TestTeamsChatsListCmd_badID() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd := cliTD.StubRootCmd(
"backup", "list", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--backup", "smarfs")
cli.BuildCommandTree(cmd)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
}
func (suite *PreparedBackupTeamsChatsE2ESuite) TestTeamsChatsDetailsCmd_chats() {
runTeamsChatsDetailsCmdTest(suite, path.ChatsCategory)
}
func runTeamsChatsDetailsCmdTest(suite *PreparedBackupTeamsChatsE2ESuite, category path.CategoryType) {
suite.dpnd.recorder.Reset()
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
bID := suite.backupOps[category]
// fetch the details from the repo first
deets, _, errs := suite.dpnd.repo.GetBackupDetails(ctx, string(bID))
require.NoError(t, errs.Failure(), clues.ToCore(errs.Failure()))
require.Empty(t, errs.Recovered())
cmd := cliTD.StubRootCmd(
"backup", "details", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--"+flags.BackupFN, string(bID))
cli.BuildCommandTree(cmd)
cmd.SetOut(&suite.dpnd.recorder)
ctx = print.SetRootCmd(ctx, cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
// compare the output
result := suite.dpnd.recorder.String()
i := 0
foundFolders := 0
for _, ent := range deets.Entries {
// Skip folders as they don't mean anything to the end user.
if ent.Folder != nil {
foundFolders++
continue
}
suite.Run(fmt.Sprintf("detail %d", i), func() {
assert.Contains(suite.T(), result, ent.ShortRef)
})
i++
}
// We only back up the default folder for each category, so there should be
// at least that folder (we don't make details entries for prefix folders).
assert.GreaterOrEqual(t, foundFolders, 1)
}
// ---------------------------------------------------------------------------
// tests for deleting backups
// ---------------------------------------------------------------------------
type BackupDeleteTeamsChatsE2ESuite struct {
tester.Suite
dpnd dependencies
backupOps [3]operations.BackupOperation
}
func TestBackupDeleteTeamsChatsE2ESuite(t *testing.T) {
suite.Run(t, &BackupDeleteTeamsChatsE2ESuite{
Suite: tester.NewE2ESuite(
t,
[][]string{storeTD.AWSStorageCredEnvs, tconfig.M365AcctCredEnvs}),
})
}
func (suite *BackupDeleteTeamsChatsE2ESuite) SetupSuite() {
t := suite.T()
t.Skip("not fully implemented")
ctx, flush := tester.NewContext(t)
defer flush()
suite.dpnd = prepM365Test(t, ctx, path.TeamsChatsService)
m365TeamsChatID := tconfig.M365TeamID(t)
teamschats := []string{m365TeamsChatID}
// some tests require an existing backup
sel := selectors.NewTeamsChatsBackup(teamschats)
sel.Include(selTD.TeamsChatsBackupChatScope(sel))
for i := 0; i < cap(suite.backupOps); i++ {
backupOp, err := suite.dpnd.repo.NewBackup(ctx, sel.Selector)
require.NoError(t, err, clues.ToCore(err))
suite.backupOps[i] = backupOp
err = suite.backupOps[i].Run(ctx)
require.NoError(t, err, clues.ToCore(err))
}
}
func (suite *BackupDeleteTeamsChatsE2ESuite) TestTeamsChatsBackupDeleteCmd() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd := cliTD.StubRootCmd(
"backup", "delete", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--"+flags.BackupIDsFN,
fmt.Sprintf("%s,%s",
string(suite.backupOps[0].Results.BackupID),
string(suite.backupOps[1].Results.BackupID)))
cli.BuildCommandTree(cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
// a follow-up details call should fail, due to the backup ID being deleted
cmd = cliTD.StubRootCmd(
"backup", "details", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--backups", string(suite.backupOps[0].Results.BackupID))
cli.BuildCommandTree(cmd)
err = cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
}
func (suite *BackupDeleteTeamsChatsE2ESuite) TestTeamsChatsBackupDeleteCmd_SingleID() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd := cliTD.StubRootCmd(
"backup", "delete", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--"+flags.BackupFN,
string(suite.backupOps[2].Results.BackupID))
cli.BuildCommandTree(cmd)
// run the command
err := cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err))
// a follow-up details call should fail, due to the backup ID being deleted
cmd = cliTD.StubRootCmd(
"backup", "details", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--backup", string(suite.backupOps[2].Results.BackupID))
cli.BuildCommandTree(cmd)
err = cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
}
func (suite *BackupDeleteTeamsChatsE2ESuite) TestTeamsChatsBackupDeleteCmd_UnknownID() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd := cliTD.StubRootCmd(
"backup", "delete", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath,
"--"+flags.BackupIDsFN, uuid.NewString())
cli.BuildCommandTree(cmd)
// unknown backupIDs should error since the modelStore can't find the backup
err := cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
}
func (suite *BackupDeleteTeamsChatsE2ESuite) TestTeamsChatsBackupDeleteCmd_NoBackupID() {
t := suite.T()
ctx, flush := tester.NewContext(t)
ctx = config.SetViper(ctx, suite.dpnd.vpr)
defer flush()
cmd := cliTD.StubRootCmd(
"backup", "delete", "chats",
"--"+flags.ConfigFileFN, suite.dpnd.configFilePath)
cli.BuildCommandTree(cmd)
// empty backupIDs should error since no data provided
err := cmd.ExecuteContext(ctx)
require.Error(t, err, clues.ToCore(err))
}
// ---------------------------------------------------------------------------
// helpers
// ---------------------------------------------------------------------------
func buildTeamsChatsBackupCmd(
ctx context.Context,
configFile, resource, category string,
recorder *strings.Builder,
) (*cobra.Command, context.Context) {
cmd := cliTD.StubRootCmd(
"backup", "create", "chats",
"--"+flags.ConfigFileFN, configFile,
"--"+flags.UserFN, resource,
"--"+flags.CategoryDataFN, category)
cli.BuildCommandTree(cmd)
cmd.SetOut(recorder)
return cmd, print.SetRootCmd(ctx, cmd)
}

View File

@ -0,0 +1,248 @@
package backup
import (
"testing"
"github.com/alcionai/clues"
"github.com/spf13/cobra"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/stretchr/testify/suite"
"github.com/alcionai/corso/src/cli/flags"
flagsTD "github.com/alcionai/corso/src/cli/flags/testdata"
cliTD "github.com/alcionai/corso/src/cli/testdata"
"github.com/alcionai/corso/src/cli/utils"
"github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/pkg/control"
)
type TeamsChatsUnitSuite struct {
tester.Suite
}
func TestTeamsChatsUnitSuite(t *testing.T) {
suite.Run(t, &TeamsChatsUnitSuite{Suite: tester.NewUnitSuite(t)})
}
func (suite *TeamsChatsUnitSuite) TestAddTeamsChatsCommands() {
expectUse := teamschatsServiceCommand
table := []struct {
name string
use string
expectUse string
expectShort string
expectRunE func(*cobra.Command, []string) error
}{
{
name: "create teamschats",
use: createCommand,
expectUse: expectUse + " " + teamschatsServiceCommandCreateUseSuffix,
expectShort: teamschatsCreateCmd().Short,
expectRunE: createTeamsChatsCmd,
},
{
name: "list teamschats",
use: listCommand,
expectUse: expectUse,
expectShort: teamschatsListCmd().Short,
expectRunE: listTeamsChatsCmd,
},
{
name: "details teamschats",
use: detailsCommand,
expectUse: expectUse + " " + teamschatsServiceCommandDetailsUseSuffix,
expectShort: teamschatsDetailsCmd().Short,
expectRunE: detailsTeamsChatsCmd,
},
{
name: "delete teamschats",
use: deleteCommand,
expectUse: expectUse + " " + teamschatsServiceCommandDeleteUseSuffix,
expectShort: teamschatsDeleteCmd().Short,
expectRunE: deleteTeamsChatsCmd,
},
}
for _, test := range table {
suite.Run(test.name, func() {
t := suite.T()
cmd := &cobra.Command{Use: test.use}
c := addTeamsChatsCommands(cmd)
require.NotNil(t, c)
cmds := cmd.Commands()
require.Len(t, cmds, 1)
child := cmds[0]
assert.Equal(t, test.expectUse, child.Use)
assert.Equal(t, test.expectShort, child.Short)
tester.AreSameFunc(t, test.expectRunE, child.RunE)
})
}
}
func (suite *TeamsChatsUnitSuite) TestValidateTeamsChatsBackupCreateFlags() {
table := []struct {
name string
cats []string
expect assert.ErrorAssertionFunc
}{
{
name: "none",
cats: []string{},
expect: assert.NoError,
},
{
name: "chats",
cats: []string{flags.DataChats},
expect: assert.NoError,
},
{
name: "all allowed",
cats: []string{
flags.DataChats,
},
expect: assert.NoError,
},
{
name: "bad inputs",
cats: []string{"foo"},
expect: assert.Error,
},
}
for _, test := range table {
suite.Run(test.name, func() {
err := validateTeamsChatsBackupCreateFlags([]string{"*"}, test.cats)
test.expect(suite.T(), err, clues.ToCore(err))
})
}
}
func (suite *TeamsChatsUnitSuite) TestBackupCreateFlags() {
t := suite.T()
cmd := cliTD.SetUpCmdHasFlags(
t,
&cobra.Command{Use: createCommand},
addTeamsChatsCommands,
[]cliTD.UseCobraCommandFn{
flags.AddAllProviderFlags,
flags.AddAllStorageFlags,
},
flagsTD.WithFlags(
teamschatsServiceCommand,
[]string{
"--" + flags.RunModeFN, flags.RunModeFlagTest,
"--" + flags.UserFN, flagsTD.FlgInputs(flagsTD.UsersInput),
"--" + flags.CategoryDataFN, flagsTD.FlgInputs(flagsTD.TeamsChatsCategoryDataInput),
},
flagsTD.PreparedGenericBackupFlags(),
flagsTD.PreparedProviderFlags(),
flagsTD.PreparedStorageFlags()))
opts := utils.MakeTeamsChatsOpts(cmd)
co := utils.Control()
backupOpts := utils.ParseBackupOptions()
// TODO(ashmrtn): Move flag checks from control.Options to control.Backup once
// restore flags are switched over too and we no longer parse flags beyond
// connection info into control.Options.
assert.Equal(t, control.FailFast, backupOpts.FailureHandling)
assert.True(t, backupOpts.Incrementals.ForceFullEnumeration)
assert.True(t, backupOpts.Incrementals.ForceItemDataRefresh)
assert.Equal(t, control.FailFast, co.FailureHandling)
assert.True(t, co.ToggleFeatures.DisableIncrementals)
assert.True(t, co.ToggleFeatures.ForceItemDataDownload)
assert.ElementsMatch(t, flagsTD.UsersInput, opts.Users)
flagsTD.AssertGenericBackupFlags(t, cmd)
flagsTD.AssertProviderFlags(t, cmd)
flagsTD.AssertStorageFlags(t, cmd)
}
func (suite *TeamsChatsUnitSuite) TestBackupListFlags() {
t := suite.T()
cmd := cliTD.SetUpCmdHasFlags(
t,
&cobra.Command{Use: listCommand},
addTeamsChatsCommands,
[]cliTD.UseCobraCommandFn{
flags.AddAllProviderFlags,
flags.AddAllStorageFlags,
},
flagsTD.WithFlags(
teamschatsServiceCommand,
[]string{
"--" + flags.RunModeFN, flags.RunModeFlagTest,
"--" + flags.BackupFN, flagsTD.BackupInput,
},
flagsTD.PreparedBackupListFlags(),
flagsTD.PreparedProviderFlags(),
flagsTD.PreparedStorageFlags()))
assert.Equal(t, flagsTD.BackupInput, flags.BackupIDFV)
flagsTD.AssertBackupListFlags(t, cmd)
flagsTD.AssertProviderFlags(t, cmd)
flagsTD.AssertStorageFlags(t, cmd)
}
func (suite *TeamsChatsUnitSuite) TestBackupDetailsFlags() {
t := suite.T()
cmd := cliTD.SetUpCmdHasFlags(
t,
&cobra.Command{Use: detailsCommand},
addTeamsChatsCommands,
[]cliTD.UseCobraCommandFn{
flags.AddAllProviderFlags,
flags.AddAllStorageFlags,
},
flagsTD.WithFlags(
teamschatsServiceCommand,
[]string{
"--" + flags.RunModeFN, flags.RunModeFlagTest,
"--" + flags.BackupFN, flagsTD.BackupInput,
"--" + flags.SkipReduceFN,
},
flagsTD.PreparedTeamsChatsFlags(),
flagsTD.PreparedProviderFlags(),
flagsTD.PreparedStorageFlags()))
co := utils.Control()
assert.Equal(t, flagsTD.BackupInput, flags.BackupIDFV)
assert.True(t, co.SkipReduce)
flagsTD.AssertProviderFlags(t, cmd)
flagsTD.AssertStorageFlags(t, cmd)
flagsTD.AssertTeamsChatsFlags(t, cmd)
}
func (suite *TeamsChatsUnitSuite) TestBackupDeleteFlags() {
t := suite.T()
cmd := cliTD.SetUpCmdHasFlags(
t,
&cobra.Command{Use: deleteCommand},
addTeamsChatsCommands,
[]cliTD.UseCobraCommandFn{
flags.AddAllProviderFlags,
flags.AddAllStorageFlags,
},
flagsTD.WithFlags(
teamschatsServiceCommand,
[]string{
"--" + flags.RunModeFN, flags.RunModeFlagTest,
"--" + flags.BackupFN, flagsTD.BackupInput,
},
flagsTD.PreparedProviderFlags(),
flagsTD.PreparedStorageFlags()))
assert.Equal(t, flagsTD.BackupInput, flags.BackupIDFV)
flagsTD.AssertProviderFlags(t, cmd)
flagsTD.AssertStorageFlags(t, cmd)
}

View File

@ -7,7 +7,6 @@ import (
"github.com/alcionai/corso/src/cli/flags" "github.com/alcionai/corso/src/cli/flags"
"github.com/alcionai/corso/src/cli/utils" "github.com/alcionai/corso/src/cli/utils"
"github.com/alcionai/corso/src/pkg/control" "github.com/alcionai/corso/src/pkg/control"
"github.com/alcionai/corso/src/pkg/selectors"
) )
// called by export.go to map subcommands to provider-specific handling. // called by export.go to map subcommands to provider-specific handling.
@ -51,7 +50,13 @@ corso export groups my-exports --backup 1234abcd-12ab-cd34-56de-1234abcd
# Export all files and folders in folder "Documents/Finance Reports" that were created before 2020 to /my-exports # Export all files and folders in folder "Documents/Finance Reports" that were created before 2020 to /my-exports
corso export groups my-exports --backup 1234abcd-12ab-cd34-56de-1234abcd \ corso export groups my-exports --backup 1234abcd-12ab-cd34-56de-1234abcd \
--folder "Documents/Finance Reports" --file-created-before 2020-01-01T00:00:00` --folder "Documents/Finance Reports" --file-created-before 2020-01-01T00:00:00
# Export all posts from a conversation with topic "hello world" from group mailbox's last backup to /my-exports
corso export groups my-exports --backup 1234abcd-12ab-cd34-56de-1234abcd --conversation "hello world"
# Export post with ID 98765abcdef from a conversation from group mailbox's last backup to /my-exports
corso export groups my-exports --backup 1234abcd-12ab-cd34-56de-1234abcd --conversation "hello world" --post 98765abcdef`
) )
// `corso export groups [<flag>...] <destination>` // `corso export groups [<flag>...] <destination>`
@ -93,10 +98,6 @@ func exportGroupsCmd(cmd *cobra.Command, args []string) error {
sel := utils.IncludeGroupsRestoreDataSelectors(ctx, opts) sel := utils.IncludeGroupsRestoreDataSelectors(ctx, opts)
utils.FilterGroupsRestoreInfoSelectors(sel, opts) utils.FilterGroupsRestoreInfoSelectors(sel, opts)
// TODO(pandeyabs): Exclude conversations from export since they are not
// supported yet. https://github.com/alcionai/corso/issues/4822
sel.Exclude(sel.Conversation(selectors.Any()))
acceptedGroupsFormatTypes := []string{ acceptedGroupsFormatTypes := []string{
string(control.DefaultFormat), string(control.DefaultFormat),
string(control.JSONFormat), string(control.JSONFormat),

View File

@ -45,7 +45,27 @@ corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
# Export all files in the "Documents" library to the current directory. # Export all files in the "Documents" library to the current directory.
corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \ corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--library Documents --folder "Display Templates/Style Sheets" .` --library Documents --folder "Display Templates/Style Sheets" .
# Export lists by their name(s)
corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list "list-name-1,list-name-2" .
# Export lists created after a given time
corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-created-after 2024-01-01T12:23:34 .
# Export lists created before a given time
corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-created-before 2024-01-01T12:23:34 .
# Export lists modified before a given time
corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-modified-before 2024-01-01T12:23:34 .
# Export lists modified after a given time
corso export sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-modified-after 2024-01-01T12:23:34 .`
) )
// `corso export sharepoint [<flag>...] <destination>` // `corso export sharepoint [<flag>...] <destination>`

View File

@ -28,13 +28,6 @@ func AddFilesystemFlags(cmd *cobra.Command) {
"", "",
"path to local or network storage") "path to local or network storage")
cobra.CheckErr(cmd.MarkFlagRequired(FilesystemPathFN)) cobra.CheckErr(cmd.MarkFlagRequired(FilesystemPathFN))
fs.BoolVar(
&SucceedIfExistsFV,
SucceedIfExistsFN,
false,
"Exit with success if the repo has already been initialized.")
cobra.CheckErr(fs.MarkHidden("succeed-if-exists"))
} }
func FilesystemFlagOverrides(cmd *cobra.Command) map[string]string { func FilesystemFlagOverrides(cmd *cobra.Command) map[string]string {

View File

@ -12,9 +12,8 @@ const (
AWSSessionTokenFN = "aws-session-token" AWSSessionTokenFN = "aws-session-token"
// Corso Flags // Corso Flags
PassphraseFN = "passphrase" PassphraseFN = "passphrase"
NewPassphraseFN = "new-passphrase" NewPassphraseFN = "new-passphrase"
SucceedIfExistsFN = "succeed-if-exists"
) )
var ( var (
@ -25,7 +24,6 @@ var (
AWSSessionTokenFV string AWSSessionTokenFV string
PassphraseFV string PassphraseFV string
NewPhasephraseFV string NewPhasephraseFV string
SucceedIfExistsFV bool
) )
// AddMultipleBackupIDsFlag adds the --backups flag. // AddMultipleBackupIDsFlag adds the --backups flag.

View File

@ -38,11 +38,6 @@ func AddS3BucketFlags(cmd *cobra.Command) {
fs.StringVar(&EndpointFV, EndpointFN, "", "S3 service endpoint.") fs.StringVar(&EndpointFV, EndpointFN, "", "S3 service endpoint.")
fs.BoolVar(&DoNotUseTLSFV, DoNotUseTLSFN, false, "Disable TLS (HTTPS)") fs.BoolVar(&DoNotUseTLSFV, DoNotUseTLSFN, false, "Disable TLS (HTTPS)")
fs.BoolVar(&DoNotVerifyTLSFV, DoNotVerifyTLSFN, false, "Disable TLS (HTTPS) certificate verification.") fs.BoolVar(&DoNotVerifyTLSFV, DoNotVerifyTLSFN, false, "Disable TLS (HTTPS) certificate verification.")
// In general, we don't want to expose this flag to users and have them mistake it
// for a broad-scale idempotency solution. We can un-hide it later if the need arises.
fs.BoolVar(&SucceedIfExistsFV, SucceedIfExistsFN, false, "Exit with success if the repo has already been initialized.")
cobra.CheckErr(fs.MarkHidden("succeed-if-exists"))
} }
func S3FlagOverrides(cmd *cobra.Command) map[string]string { func S3FlagOverrides(cmd *cobra.Command) map[string]string {

View File

@ -18,7 +18,6 @@ const (
ListModifiedBeforeFN = "list-modified-before" ListModifiedBeforeFN = "list-modified-before"
ListCreatedAfterFN = "list-created-after" ListCreatedAfterFN = "list-created-after"
ListCreatedBeforeFN = "list-created-before" ListCreatedBeforeFN = "list-created-before"
AllowListsRestoreFN = "allow-lists-restore"
PageFolderFN = "page-folder" PageFolderFN = "page-folder"
PageFN = "page" PageFN = "page"
@ -35,7 +34,6 @@ var (
ListModifiedBeforeFV string ListModifiedBeforeFV string
ListCreatedAfterFV string ListCreatedAfterFV string
ListCreatedBeforeFV string ListCreatedBeforeFV string
AllowListsRestoreFV bool
PageFolderFV []string PageFolderFV []string
PageFV []string PageFV []string
@ -101,11 +99,6 @@ func AddSharePointDetailsAndRestoreFlags(cmd *cobra.Command) {
&ListCreatedBeforeFV, &ListCreatedBeforeFV,
ListCreatedBeforeFN, "", ListCreatedBeforeFN, "",
"Select lists created before this datetime.") "Select lists created before this datetime.")
fs.BoolVar(
&AllowListsRestoreFV,
AllowListsRestoreFN, false,
"enables lists restore if provided")
cobra.CheckErr(fs.MarkHidden(AllowListsRestoreFN))
// pages // pages

View File

@ -0,0 +1,13 @@
package flags
import (
"github.com/spf13/cobra"
)
const (
DataChats = "chats"
)
func AddTeamsChatsDetailsAndRestoreFlags(cmd *cobra.Command) {
// TODO: add details flags
}
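The commented-out names in the testdata file further down this diff hint at the filters this TODO anticipates. A hypothetical first flag, modeled on the SharePoint list flags earlier in the diff (the flag name and the ChatCreatedAfterFV variable do not exist yet):
func AddTeamsChatsDetailsAndRestoreFlags(cmd *cobra.Command) {
	fs := cmd.Flags()
	// hypothetical filter mirroring the ChatCreatedAfter references
	// in the flags testdata package
	fs.StringVar(
		&ChatCreatedAfterFV,
		"chat-created-after", "",
		"Select chats created after this datetime.")
}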

View File

@ -21,6 +21,7 @@ var (
ExchangeCategoryDataInput = []string{"email", "events", "contacts"} ExchangeCategoryDataInput = []string{"email", "events", "contacts"}
SharepointCategoryDataInput = []string{"files", "lists", "pages"} SharepointCategoryDataInput = []string{"files", "lists", "pages"}
GroupsCategoryDataInput = []string{"files", "lists", "pages", "messages"} GroupsCategoryDataInput = []string{"files", "lists", "pages", "messages"}
TeamsChatsCategoryDataInput = []string{"chats"}
ChannelInput = []string{"channel1", "channel2"} ChannelInput = []string{"channel1", "channel2"}
MessageInput = []string{"message1", "message2"} MessageInput = []string{"message1", "message2"}

25
src/cli/flags/testdata/teamschats.go vendored Normal file
View File

@ -0,0 +1,25 @@
package testdata
import (
"testing"
"github.com/spf13/cobra"
)
func PreparedTeamsChatsFlags() []string {
return []string{
// FIXME: populate when adding filters
// "--" + flags.ChatCreatedAfterFN, ChatCreatedAfterInput,
// "--" + flags.ChatCreatedBeforeFN, ChatCreatedBeforeInput,
// "--" + flags.ChatLastMessageAfterFN, ChatLastMessageAfterInput,
// "--" + flags.ChatLastMessageBeforeFN, ChatLastMessageBeforeInput,
}
}
func AssertTeamsChatsFlags(t *testing.T, cmd *cobra.Command) {
// FIXME: populate when adding filters
// assert.Equal(t, ChatCreatedAfterInput, flags.ChatCreatedAfterFV)
// assert.Equal(t, ChatCreatedBeforeInput, flags.ChatCreatedBeforeFV)
// assert.Equal(t, ChatLastMessageAfterInput, flags.ChatLastMessageAfterFV)
// assert.Equal(t, ChatLastMessageBeforeInput, flags.ChatLastMessageBeforeFV)
}

View File

@ -133,7 +133,7 @@ func Pretty(ctx context.Context, a any) {
return return
} }
printPrettyJSON(getRootCmd(ctx).ErrOrStderr(), a) printPrettyJSON(ctx, getRootCmd(ctx).ErrOrStderr(), a)
} }
// PrettyJSON prettifies and prints the value. // PrettyJSON prettifies and prints the value.
@ -143,7 +143,7 @@ func PrettyJSON(ctx context.Context, p minimumPrintabler) {
return return
} }
outputJSON(getRootCmd(ctx).ErrOrStderr(), p, outputAsJSONDebug) outputJSON(ctx, getRootCmd(ctx).ErrOrStderr(), p, outputAsJSONDebug)
} }
// out is the testable core of exported print funcs // out is the testable core of exported print funcs
@ -193,56 +193,56 @@ type minimumPrintabler interface {
// Item prints the printable, according to the caller's requested format. // Item prints the printable, according to the caller's requested format.
func Item(ctx context.Context, p Printable) { func Item(ctx context.Context, p Printable) {
printItem(getRootCmd(ctx).OutOrStdout(), p) printItem(ctx, getRootCmd(ctx).OutOrStdout(), p)
} }
// print prints the printable items, // print prints the printable items,
// according to the caller's requested format. // according to the caller's requested format.
func printItem(w io.Writer, p Printable) { func printItem(ctx context.Context, w io.Writer, p Printable) {
if outputAsJSON || outputAsJSONDebug { if outputAsJSON || outputAsJSONDebug {
outputJSON(w, p, outputAsJSONDebug) outputJSON(ctx, w, p, outputAsJSONDebug)
return return
} }
outputTable(w, []Printable{p}) outputTable(ctx, w, []Printable{p})
} }
// ItemProperties prints the printable either as a single line or as json // ItemProperties prints the printable either as a single line or as json
// The difference between this and Item is that this one does not print the ID // The difference between this and Item is that this one does not print the ID
func ItemProperties(ctx context.Context, p Printable) { func ItemProperties(ctx context.Context, p Printable) {
printItemProperties(getRootCmd(ctx).OutOrStdout(), p) printItemProperties(ctx, getRootCmd(ctx).OutOrStdout(), p)
} }
// print prints the printable items, // print prints the printable items,
// according to the caller's requested format. // according to the caller's requested format.
func printItemProperties(w io.Writer, p Printable) { func printItemProperties(ctx context.Context, w io.Writer, p Printable) {
if outputAsJSON || outputAsJSONDebug { if outputAsJSON || outputAsJSONDebug {
outputJSON(w, p, outputAsJSONDebug) outputJSON(ctx, w, p, outputAsJSONDebug)
return return
} }
outputOneLine(w, []Printable{p}) outputOneLine(ctx, w, []Printable{p})
} }
// All prints the slice of printable items, // All prints the slice of printable items,
// according to the caller's requested format. // according to the caller's requested format.
func All(ctx context.Context, ps ...Printable) { func All(ctx context.Context, ps ...Printable) {
printAll(getRootCmd(ctx).OutOrStdout(), ps) printAll(ctx, getRootCmd(ctx).OutOrStdout(), ps)
} }
// printAll prints the slice of printable items, // printAll prints the slice of printable items,
// according to the caller's requested format. // according to the caller's requested format.
func printAll(w io.Writer, ps []Printable) { func printAll(ctx context.Context, w io.Writer, ps []Printable) {
if len(ps) == 0 { if len(ps) == 0 {
return return
} }
if outputAsJSON || outputAsJSONDebug { if outputAsJSON || outputAsJSONDebug {
outputJSONArr(w, ps, outputAsJSONDebug) outputJSONArr(ctx, w, ps, outputAsJSONDebug)
return return
} }
outputTable(w, ps) outputTable(ctx, w, ps)
} }
// ------------------------------------------------------------------------------------------ // ------------------------------------------------------------------------------------------
@ -252,11 +252,11 @@ func printAll(w io.Writer, ps []Printable) {
// Table writes the printables in a tabular format. Takes headers from // Table writes the printables in a tabular format. Takes headers from
// the 0th printable only. // the 0th printable only.
func Table(ctx context.Context, ps []Printable) { func Table(ctx context.Context, ps []Printable) {
outputTable(getRootCmd(ctx).OutOrStdout(), ps) outputTable(ctx, getRootCmd(ctx).OutOrStdout(), ps)
} }
// output to stdout the list of printable structs in a table // output to stdout the list of printable structs in a table
func outputTable(w io.Writer, ps []Printable) { func outputTable(ctx context.Context, w io.Writer, ps []Printable) {
t := table.Table{ t := table.Table{
Headers: ps[0].Headers(false), Headers: ps[0].Headers(false),
Rows: [][]string{}, Rows: [][]string{},
@ -266,6 +266,9 @@ func outputTable(w io.Writer, ps []Printable) {
t.Rows = append(t.Rows, p.Values(false)) t.Rows = append(t.Rows, p.Values(false))
} }
// observe bars needs to be flushed before printing
observe.Flush(ctx)
_ = t.WriteTable( _ = t.WriteTable(
w, w,
&table.Config{ &table.Config{
@ -279,20 +282,20 @@ func outputTable(w io.Writer, ps []Printable) {
// JSON // JSON
// ------------------------------------------------------------------------------------------ // ------------------------------------------------------------------------------------------
func outputJSON(w io.Writer, p minimumPrintabler, debug bool) { func outputJSON(ctx context.Context, w io.Writer, p minimumPrintabler, debug bool) {
if debug { if debug {
printJSON(w, p) printJSON(ctx, w, p)
return return
} }
if debug { if debug {
printJSON(w, p) printJSON(ctx, w, p)
} else { } else {
printJSON(w, p.MinimumPrintable()) printJSON(ctx, w, p.MinimumPrintable())
} }
} }
func outputJSONArr(w io.Writer, ps []Printable, debug bool) { func outputJSONArr(ctx context.Context, w io.Writer, ps []Printable, debug bool) {
sl := make([]any, 0, len(ps)) sl := make([]any, 0, len(ps))
for _, p := range ps { for _, p := range ps {
@ -303,11 +306,14 @@ func outputJSONArr(w io.Writer, ps []Printable, debug bool) {
} }
} }
printJSON(w, sl) printJSON(ctx, w, sl)
} }
// output to stdout the list of printable structs as json. // output to stdout the list of printable structs as json.
func printJSON(w io.Writer, a any) { func printJSON(ctx context.Context, w io.Writer, a any) {
// observe bars needs to be flushed before printing
observe.Flush(ctx)
bs, err := json.Marshal(a) bs, err := json.Marshal(a)
if err != nil { if err != nil {
fmt.Fprintf(w, "error formatting results to json: %v\n", err) fmt.Fprintf(w, "error formatting results to json: %v\n", err)
@ -318,7 +324,10 @@ func printJSON(w io.Writer, a any) {
} }
// output to stdout the list of printable structs as prettified json. // output to stdout the list of printable structs as prettified json.
func printPrettyJSON(w io.Writer, a any) { func printPrettyJSON(ctx context.Context, w io.Writer, a any) {
// observe bars needs to be flushed before printing
observe.Flush(ctx)
bs, err := json.MarshalIndent(a, "", " ") bs, err := json.MarshalIndent(a, "", " ")
if err != nil { if err != nil {
fmt.Fprintf(w, "error formatting results to json: %v\n", err) fmt.Fprintf(w, "error formatting results to json: %v\n", err)
@ -334,7 +343,10 @@ func printPrettyJSON(w io.Writer, a any) {
// Output in the following format: // Output in the following format:
// Bytes Uploaded: 401 kB | Items Uploaded: 59 | Items Skipped: 0 | Errors: 0 // Bytes Uploaded: 401 kB | Items Uploaded: 59 | Items Skipped: 0 | Errors: 0
func outputOneLine(w io.Writer, ps []Printable) { func outputOneLine(ctx context.Context, w io.Writer, ps []Printable) {
// observe bars needs to be flushed before printing
observe.Flush(ctx)
headers := ps[0].Headers(true) headers := ps[0].Headers(true)
rows := [][]string{} rows := [][]string{}
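The refactor above threads the command context through every print helper for one reason: observe progress bars write to the same terminal as the printed results, so each writer calls observe.Flush(ctx) before emitting output. A minimal sketch of the pattern, assuming the observe package shown in this diff:

// illustrative: flush in-flight progress bars before writing output,
// so bar redraws don't interleave with the printed result.
func printLine(ctx context.Context, w io.Writer, msg string) {
	observe.Flush(ctx)
	fmt.Fprintln(w, msg)
}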

View File

@ -2,7 +2,6 @@ package repo
import ( import (
"github.com/alcionai/clues" "github.com/alcionai/clues"
"github.com/pkg/errors"
"github.com/spf13/cobra" "github.com/spf13/cobra"
"github.com/alcionai/corso/src/cli/flags" "github.com/alcionai/corso/src/cli/flags"
@ -110,10 +109,6 @@ func initFilesystemCmd(cmd *cobra.Command, args []string) error {
ric := repository.InitConfig{RetentionOpts: retentionOpts} ric := repository.InitConfig{RetentionOpts: retentionOpts}
if err = r.Initialize(ctx, ric); err != nil { if err = r.Initialize(ctx, ric); err != nil {
if flags.SucceedIfExistsFV && errors.Is(err, repository.ErrorRepoAlreadyExists) {
return nil
}
return Only(ctx, clues.Stack(ErrInitializingRepo, err)) return Only(ctx, clues.Stack(ErrInitializingRepo, err))
} }

View File

@ -5,7 +5,6 @@ import (
"testing" "testing"
"github.com/alcionai/clues" "github.com/alcionai/clues"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require" "github.com/stretchr/testify/require"
"github.com/stretchr/testify/suite" "github.com/stretchr/testify/suite"
@ -82,9 +81,9 @@ func (suite *FilesystemE2ESuite) TestInitFilesystemCmd() {
err = cmd.ExecuteContext(ctx) err = cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
// a second initialization should result in an error // noop
err = cmd.ExecuteContext(ctx) err = cmd.ExecuteContext(ctx)
assert.ErrorIs(t, err, repository.ErrorRepoAlreadyExists, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
}) })
} }
} }

View File

@ -4,7 +4,6 @@ import (
"strings" "strings"
"github.com/alcionai/clues" "github.com/alcionai/clues"
"github.com/pkg/errors"
"github.com/spf13/cobra" "github.com/spf13/cobra"
"github.com/alcionai/corso/src/cli/flags" "github.com/alcionai/corso/src/cli/flags"
@ -132,10 +131,6 @@ func initS3Cmd(cmd *cobra.Command, args []string) error {
ric := repository.InitConfig{RetentionOpts: retentionOpts} ric := repository.InitConfig{RetentionOpts: retentionOpts}
if err = r.Initialize(ctx, ric); err != nil { if err = r.Initialize(ctx, ric); err != nil {
if flags.SucceedIfExistsFV && errors.Is(err, repository.ErrorRepoAlreadyExists) {
return nil
}
return Only(ctx, clues.Stack(ErrInitializingRepo, err)) return Only(ctx, clues.Stack(ErrInitializingRepo, err))
} }

View File

@ -89,9 +89,9 @@ func (suite *S3E2ESuite) TestInitS3Cmd() {
err = cmd.ExecuteContext(ctx) err = cmd.ExecuteContext(ctx)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
// a second initialization should result in an error // noop
err = cmd.ExecuteContext(ctx) err = cmd.ExecuteContext(ctx)
assert.ErrorIs(t, err, repository.ErrorRepoAlreadyExists, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
}) })
} }
} }
@ -116,8 +116,7 @@ func (suite *S3E2ESuite) TestInitMultipleTimes() {
"repo", "init", "s3", "repo", "init", "s3",
"--"+flags.ConfigFileFN, configFP, "--"+flags.ConfigFileFN, configFP,
"--bucket", cfg.Bucket, "--bucket", cfg.Bucket,
"--prefix", cfg.Prefix, "--prefix", cfg.Prefix)
"--succeed-if-exists")
cli.BuildCommandTree(cmd) cli.BuildCommandTree(cmd)
// run the command // run the command

View File

@ -6,7 +6,6 @@ import (
"github.com/alcionai/corso/src/cli/flags" "github.com/alcionai/corso/src/cli/flags"
"github.com/alcionai/corso/src/cli/utils" "github.com/alcionai/corso/src/cli/utils"
"github.com/alcionai/corso/src/pkg/dttm" "github.com/alcionai/corso/src/pkg/dttm"
"github.com/alcionai/corso/src/pkg/selectors"
) )
// called by restore.go to map subcommands to provider-specific handling. // called by restore.go to map subcommands to provider-specific handling.
@ -51,7 +50,27 @@ corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
# Restore all files in the "Documents" library. # Restore all files in the "Documents" library.
corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \ corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--library Documents --folder "Display Templates/Style Sheets" ` --library Documents --folder "Display Templates/Style Sheets"
# Restore lists by their name(s)
corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list "list-name-1,list-name-2"
# Restore lists created after a given time
corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-created-after 2024-01-01T12:23:34
# Restore lists created before a given time
corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-created-before 2024-01-01T12:23:34
# Restore lists modified before a given time
corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-modified-before 2024-01-01T12:23:34
# Restore lists modified after a given time
corso restore sharepoint --backup 1234abcd-12ab-cd34-56de-1234abcd \
--list-modified-after 2024-01-01T12:23:34`
) )
// `corso restore sharepoint [<flag>...]` // `corso restore sharepoint [<flag>...]`
@ -87,11 +106,6 @@ func restoreSharePointCmd(cmd *cobra.Command, args []string) error {
sel := utils.IncludeSharePointRestoreDataSelectors(ctx, opts) sel := utils.IncludeSharePointRestoreDataSelectors(ctx, opts)
utils.FilterSharePointRestoreInfoSelectors(sel, opts) utils.FilterSharePointRestoreInfoSelectors(sel, opts)
if !opts.AllowListsRestore {
// Exclude lists from restore since they are not supported yet.
sel.Exclude(sel.Lists(selectors.Any()))
}
return runRestore( return runRestore(
ctx, ctx,
cmd, cmd,

View File

@ -103,7 +103,6 @@ func (suite *FlagUnitSuite) TestAddS3BucketFlags() {
assert.Equal(t, "prefix1", flags.PrefixFV, flags.PrefixFN) assert.Equal(t, "prefix1", flags.PrefixFV, flags.PrefixFN)
assert.True(t, flags.DoNotUseTLSFV, flags.DoNotUseTLSFN) assert.True(t, flags.DoNotUseTLSFV, flags.DoNotUseTLSFN)
assert.True(t, flags.DoNotVerifyTLSFV, flags.DoNotVerifyTLSFN) assert.True(t, flags.DoNotVerifyTLSFV, flags.DoNotVerifyTLSFN)
assert.True(t, flags.SucceedIfExistsFV, flags.SucceedIfExistsFN)
}, },
} }
@ -116,7 +115,6 @@ func (suite *FlagUnitSuite) TestAddS3BucketFlags() {
"--" + flags.PrefixFN, "prefix1", "--" + flags.PrefixFN, "prefix1",
"--" + flags.DoNotUseTLSFN, "--" + flags.DoNotUseTLSFN,
"--" + flags.DoNotVerifyTLSFN, "--" + flags.DoNotVerifyTLSFN,
"--" + flags.SucceedIfExistsFN,
}) })
err := cmd.Execute() err := cmd.Execute()
@ -130,7 +128,6 @@ func (suite *FlagUnitSuite) TestFilesystemFlags() {
Use: "test", Use: "test",
Run: func(cmd *cobra.Command, args []string) { Run: func(cmd *cobra.Command, args []string) {
assert.Equal(t, "/tmp/test", flags.FilesystemPathFV, flags.FilesystemPathFN) assert.Equal(t, "/tmp/test", flags.FilesystemPathFV, flags.FilesystemPathFN)
assert.True(t, flags.SucceedIfExistsFV, flags.SucceedIfExistsFN)
assert.Equal(t, "tenantID", flags.AzureClientTenantFV, flags.AzureClientTenantFN) assert.Equal(t, "tenantID", flags.AzureClientTenantFV, flags.AzureClientTenantFN)
assert.Equal(t, "clientID", flags.AzureClientIDFV, flags.AzureClientIDFN) assert.Equal(t, "clientID", flags.AzureClientIDFV, flags.AzureClientIDFN)
assert.Equal(t, "secret", flags.AzureClientSecretFV, flags.AzureClientSecretFN) assert.Equal(t, "secret", flags.AzureClientSecretFV, flags.AzureClientSecretFN)
@ -143,7 +140,6 @@ func (suite *FlagUnitSuite) TestFilesystemFlags() {
cmd.SetArgs([]string{ cmd.SetArgs([]string{
"test", "test",
"--" + flags.FilesystemPathFN, "/tmp/test", "--" + flags.FilesystemPathFN, "/tmp/test",
"--" + flags.SucceedIfExistsFN,
"--" + flags.AzureClientIDFN, "clientID", "--" + flags.AzureClientIDFN, "clientID",
"--" + flags.AzureClientTenantFN, "tenantID", "--" + flags.AzureClientTenantFN, "tenantID",
"--" + flags.AzureClientSecretFN, "secret", "--" + flags.AzureClientSecretFN, "secret",

View File

@ -266,9 +266,14 @@ func IncludeGroupsRestoreDataSelectors(ctx context.Context, opts GroupsOpts) *se
opts.Conversations = selectors.Any() opts.Conversations = selectors.Any()
} }
// if no post is specified, select all posts in the conversation
if convPosts == 0 {
opts.Posts = selectors.Any()
}
// if no post is specified, only select conversations; // if no post is specified, only select conversations;
// otherwise, look for channel/message pairs // otherwise, look for conv/post pairs
if chanMsgs == 0 { if convs == 0 {
sel.Include(sel.Conversation(opts.Conversations)) sel.Include(sel.Conversation(opts.Conversations))
} else { } else {
sel.Include(sel.ConversationPosts(opts.Conversations, opts.Posts)) sel.Include(sel.ConversationPosts(opts.Conversations, opts.Posts))

View File

@ -30,7 +30,6 @@ type SharePointOpts struct {
ListModifiedBefore string ListModifiedBefore string
ListCreatedBefore string ListCreatedBefore string
ListCreatedAfter string ListCreatedAfter string
AllowListsRestore bool
PageFolder []string PageFolder []string
Page []string Page []string
@ -82,7 +81,6 @@ func MakeSharePointOpts(cmd *cobra.Command) SharePointOpts {
ListModifiedBefore: flags.ListModifiedBeforeFV, ListModifiedBefore: flags.ListModifiedBeforeFV,
ListCreatedAfter: flags.ListCreatedAfterFV, ListCreatedAfter: flags.ListCreatedAfterFV,
ListCreatedBefore: flags.ListCreatedBeforeFV, ListCreatedBefore: flags.ListCreatedBeforeFV,
AllowListsRestore: flags.AllowListsRestoreFV,
Page: flags.PageFV, Page: flags.PageFV,
PageFolder: flags.PageFolderFV, PageFolder: flags.PageFolderFV,
@ -106,7 +104,9 @@ func SharePointAllowedCategories() map[string]struct{} {
func AddCategories(sel *selectors.SharePointBackup, cats []string) *selectors.SharePointBackup { func AddCategories(sel *selectors.SharePointBackup, cats []string) *selectors.SharePointBackup {
if len(cats) == 0 { if len(cats) == 0 {
sel.Include(sel.LibraryFolders(selectors.Any()), sel.Lists(selectors.Any())) // TODO(hitesh): enable lists by default, without requiring an explicit --data flag
// sel.Include(sel.LibraryFolders(selectors.Any()), sel.Lists(selectors.Any()))
sel.Include(sel.LibraryFolders(selectors.Any()))
} }
for _, d := range cats { for _, d := range cats {

View File

@ -420,7 +420,7 @@ func (suite *SharePointUtilsSuite) TestAddSharepointCategories() {
{ {
name: "none", name: "none",
cats: []string{}, cats: []string{},
expectScopeLen: 2, expectScopeLen: 1,
}, },
{ {
name: "libraries", name: "libraries",

src/cli/utils/teamschats.go (new file, 101 lines)
View File

@ -0,0 +1,101 @@
package utils
import (
"context"
"github.com/alcionai/clues"
"github.com/spf13/cobra"
"github.com/alcionai/corso/src/cli/flags"
"github.com/alcionai/corso/src/pkg/selectors"
)
type TeamsChatsOpts struct {
Users []string
ExportCfg ExportCfgOpts
Populated flags.PopulatedFlags
}
func TeamsChatsAllowedCategories() map[string]struct{} {
return map[string]struct{}{
flags.DataChats: {},
}
}
func AddTeamsChatsCategories(sel *selectors.TeamsChatsBackup, cats []string) *selectors.TeamsChatsBackup {
if len(cats) == 0 {
sel.Include(sel.AllData())
}
for _, d := range cats {
switch d {
case flags.DataChats:
sel.Include(sel.Chats(selectors.Any()))
}
}
return sel
}
func MakeTeamsChatsOpts(cmd *cobra.Command) TeamsChatsOpts {
return TeamsChatsOpts{
Users: flags.UserFV,
ExportCfg: makeExportCfgOpts(cmd),
// populated contains the list of flags that appear in the
// command, according to pflags. Use this to differentiate
// between an "empty" and a "missing" value.
Populated: flags.GetPopulatedFlags(cmd),
}
}
// ValidateTeamsChatsRestoreFlags checks common flags for correctness and interdependencies
func ValidateTeamsChatsRestoreFlags(backupID string, opts TeamsChatsOpts, isRestore bool) error {
if len(backupID) == 0 {
return clues.New("a backup ID is required")
}
// restore isn't currently supported
if isRestore {
return clues.New("restore not supported")
}
return nil
}
// AddTeamsChatsFilter adds the scope of the provided values to the selector's
// filter set
func AddTeamsChatsFilter(
sel *selectors.TeamsChatsRestore,
v string,
f func(string) []selectors.TeamsChatsScope,
) {
if len(v) == 0 {
return
}
sel.Filter(f(v))
}
// IncludeTeamsChatsRestoreDataSelectors builds the common data-selector
// inclusions for teamschats commands.
func IncludeTeamsChatsRestoreDataSelectors(ctx context.Context, opts TeamsChatsOpts) *selectors.TeamsChatsRestore {
users := opts.Users
if len(opts.Users) == 0 {
users = selectors.Any()
}
return selectors.NewTeamsChatsRestore(users)
}
// FilterTeamsChatsRestoreInfoSelectors builds the common info-selector filters.
func FilterTeamsChatsRestoreInfoSelectors(
sel *selectors.TeamsChatsRestore,
opts TeamsChatsOpts,
) {
// TODO: populate when adding filters
}
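A hypothetical wiring of these helpers, for illustration only (cmd, ctx, and backupID are assumed to come from the calling command):

opts := utils.MakeTeamsChatsOpts(cmd)
if err := utils.ValidateTeamsChatsRestoreFlags(backupID, opts, false); err != nil {
	return err
}

sel := utils.IncludeTeamsChatsRestoreDataSelectors(ctx, opts)
utils.FilterTeamsChatsRestoreInfoSelectors(sel, opts)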

View File

@ -6,12 +6,6 @@ Param (
[Parameter(Mandatory = $False, HelpMessage = "Site for which to delete folders in SharePoint")] [Parameter(Mandatory = $False, HelpMessage = "Site for which to delete folders in SharePoint")]
[String]$Site, [String]$Site,
[Parameter(Mandatory = $False, HelpMessage = "Exchange Admin email")]
[String]$AdminUser = $ENV:M365_TENANT_ADMIN_USER,
[Parameter(Mandatory = $False, HelpMessage = "Exchange Admin password")]
[String]$AdminPwd = $ENV:M365_TENANT_ADMIN_PASSWORD,
[Parameter(Mandatory = $False, HelpMessage = "Document library root. Can add multiple comma-separated values")] [Parameter(Mandatory = $False, HelpMessage = "Document library root. Can add multiple comma-separated values")]
[String[]]$LibraryNameList = @(), [String[]]$LibraryNameList = @(),
@ -22,7 +16,16 @@ Param (
[String[]]$FolderPrefixPurgeList, [String[]]$FolderPrefixPurgeList,
[Parameter(Mandatory = $False, HelpMessage = "Delete document libraries with this prefix")] [Parameter(Mandatory = $False, HelpMessage = "Delete document libraries with this prefix")]
[String[]]$LibraryPrefixDeleteList = @() [String[]]$LibraryPrefixDeleteList = @(),
[Parameter(Mandatory = $False, HelpMessage = "Tenant domain")]
[String]$TenantDomain = $ENV:TENANT_DOMAIN,
[Parameter(Mandatory = $False, HelpMessage = "Azure ClientId")]
[String]$ClientId = $ENV:AZURE_CLIENT_ID,
[Parameter(Mandatory = $False, HelpMessage = "Azure AppCert")]
[String]$AppCert = $ENV:AZURE_APP_CERT
) )
Set-StrictMode -Version 2.0 Set-StrictMode -Version 2.0
@ -108,6 +111,7 @@ function Purge-Library {
$foldersToPurge = @() $foldersToPurge = @()
$folders = Get-PnPFolderItem -FolderSiteRelativeUrl $LibraryName -ItemType Folder $folders = Get-PnPFolderItem -FolderSiteRelativeUrl $LibraryName -ItemType Folder
Write-Host "`nFolders: $folders"
foreach ($f in $folders) { foreach ($f in $folders) {
$folderName = $f.Name $folderName = $f.Name
$createTime = Get-TimestampFromFolderName -Folder $f $createTime = Get-TimestampFromFolderName -Folder $f
@ -209,8 +213,8 @@ if (-not (Get-Module -ListAvailable -Name PnP.PowerShell)) {
} }
if ([string]::IsNullOrEmpty($AdminUser) -or [string]::IsNullOrEmpty($AdminPwd)) { if ([string]::IsNullOrEmpty($ClientId) -or [string]::IsNullOrEmpty($AppCert)) {
Write-Host "Admin user name and password required as arguments or environment variables." Write-Host "ClientId and AppCert required as arguments or environment variables."
Exit Exit
} }
@ -251,12 +255,8 @@ else {
Exit Exit
} }
$password = convertto-securestring -String $AdminPwd -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $AdminUser, $password
Write-Host "`nAuthenticating and connecting to $SiteUrl" Write-Host "`nAuthenticating and connecting to $SiteUrl"
Connect-PnPOnline -Url $siteUrl -Credential $cred Connect-PnPOnline -Url $siteUrl -ClientId $ClientId -CertificateBase64Encoded $AppCert -Tenant $TenantDomain
Write-Host "Connected to $siteUrl`n" Write-Host "Connected to $siteUrl`n"
# ensure that there are no unexpanded entries in the list of parameters # ensure that there are no unexpanded entries in the list of parameters

View File

@ -5,6 +5,7 @@ import (
"github.com/alcionai/clues" "github.com/alcionai/clues"
"github.com/microsoftgraph/msgraph-sdk-go/models" "github.com/microsoftgraph/msgraph-sdk-go/models"
"golang.org/x/exp/slices"
"github.com/alcionai/corso/src/cmd/sanity_test/common" "github.com/alcionai/corso/src/cmd/sanity_test/common"
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
@ -20,19 +21,20 @@ const (
// this increases the chance that we'll run into a race collision with // this increases the chance that we'll run into a race collision with
// the cleanup script. Sometimes that's okay (deleting old data that // the cleanup script. Sometimes that's okay (deleting old data that
// isn't scrutinized in the test), other times it's not. We mark whether // isn't scrutinized in the test), other times it's not. We mark whether
// that's okay to do or not by specifying the folder that's being // that's okay to do or not by specifying the folders being
// scrutinized for the test. Any errors within that folder should cause // scrutinized for the test. Any errors within those folders should cause
// a fatal exit. Errors outside of that folder get ignored. // a fatal exit. Errors outside of those folders get ignored.
// //
// since we're using folder names, requireNoErrorsWithinFolderName will // since we're using folder names, mustPopulateFolders will
// work best (ie: have the fewest collisions/side-effects) if the folder // work best (ie: have the fewest collisions/side-effects) if the folder
// name is very specific. Standard sanity tests should include timestamps, // names are very specific. Standard sanity tests should include timestamps,
// which should help ensure that. Be warned if you try to use it with // which should help ensure that. Be warned if you try to use it with
// a more generic name: unintended effects could occur. // a more generic name: unintended effects could occur.
func populateSanitree( func populateSanitree(
ctx context.Context, ctx context.Context,
ac api.Client, ac api.Client,
driveID, requireNoErrorsWithinFolderName string, driveID string,
mustPopulateFolders []string,
) *common.Sanitree[models.DriveItemable, models.DriveItemable] { ) *common.Sanitree[models.DriveItemable, models.DriveItemable] {
common.Infof(ctx, "building sanitree for drive: %s", driveID) common.Infof(ctx, "building sanitree for drive: %s", driveID)
@ -56,8 +58,8 @@ func populateSanitree(
ac, ac,
driveID, driveID,
stree.Name+"/", stree.Name+"/",
requireNoErrorsWithinFolderName, mustPopulateFolders,
rootName == requireNoErrorsWithinFolderName, slices.Contains(mustPopulateFolders, rootName),
stree) stree)
return stree return stree
@ -66,7 +68,9 @@ func populateSanitree(
func recursivelyBuildTree( func recursivelyBuildTree(
ctx context.Context, ctx context.Context,
ac api.Client, ac api.Client,
driveID, location, requireNoErrorsWithinFolderName string, driveID string,
location string,
mustPopulateFolders []string,
isChildOfFolderRequiringNoErrors bool, isChildOfFolderRequiringNoErrors bool,
stree *common.Sanitree[models.DriveItemable, models.DriveItemable], stree *common.Sanitree[models.DriveItemable, models.DriveItemable],
) { ) {
@ -80,9 +84,9 @@ func recursivelyBuildTree(
common.Infof( common.Infof(
ctx, ctx,
"ignoring error getting children in directory %q because it is not within directory %q\nerror: %s\n%+v", "ignoring error getting children in directory %q because it is not within directory set %v\nerror: %s\n%+v",
location, location,
requireNoErrorsWithinFolderName, mustPopulateFolders,
err.Error(), err.Error(),
clues.ToCore(err)) clues.ToCore(err))
@ -99,11 +103,12 @@ func recursivelyBuildTree(
// currently we don't restore blank folders. // currently we don't restore blank folders.
// skip permission check for empty folders // skip permission check for empty folders
if ptr.Val(driveItem.GetFolder().GetChildCount()) == 0 { if ptr.Val(driveItem.GetFolder().GetChildCount()) == 0 {
common.Infof(ctx, "skipped empty folder: %s/%s", location, itemName) common.Infof(ctx, "skipped empty folder: %s%s", location, itemName)
continue continue
} }
cannotAllowErrors := isChildOfFolderRequiringNoErrors || itemName == requireNoErrorsWithinFolderName cannotAllowErrors := isChildOfFolderRequiringNoErrors ||
slices.Contains(mustPopulateFolders, itemName)
branch := &common.Sanitree[models.DriveItemable, models.DriveItemable]{ branch := &common.Sanitree[models.DriveItemable, models.DriveItemable]{
Parent: stree, Parent: stree,
@ -124,7 +129,7 @@ func recursivelyBuildTree(
ac, ac,
driveID, driveID,
location+branch.Name+"/", location+branch.Name+"/",
requireNoErrorsWithinFolderName, mustPopulateFolders,
cannotAllowErrors, cannotAllowErrors,
branch) branch)
} }
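The membership check above comes from golang.org/x/exp/slices, imported at the top of this file; for reference (folder names illustrative):

// slices.Contains reports whether the value is present in the slice.
names := []string{"source-2024-03-22", "restore-2024-03-22"}
ok := slices.Contains(names, "restore-2024-03-22") // true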

View File

@ -32,7 +32,7 @@ func CheckExport(
ctx, ctx,
ac, ac,
driveID, driveID,
envs.RestoreContainer) []string{envs.SourceContainer})
sourceTree, ok := root.Children[envs.SourceContainer] sourceTree, ok := root.Children[envs.SourceContainer]
common.Assert( common.Assert(

View File

@ -45,7 +45,14 @@ func CheckRestoration(
"drive_id", driveID, "drive_id", driveID,
"drive_name", driveName) "drive_name", driveName)
root := populateSanitree(ctx, ac, driveID, envs.RestoreContainer) root := populateSanitree(
ctx,
ac,
driveID,
[]string{
envs.SourceContainer,
envs.RestoreContainer,
})
sourceTree, ok := root.Children[envs.SourceContainer] sourceTree, ok := root.Children[envs.SourceContainer]
common.Assert( common.Assert(

View File

@ -3,7 +3,7 @@ module github.com/alcionai/corso/src
go 1.21 go 1.21
replace ( replace (
github.com/kopia/kopia => github.com/alcionai/kopia v0.12.2-0.20240116215733-ec3d100029fe github.com/kopia/kopia => github.com/alcionai/kopia v0.12.2-0.20240322180947-41471159a0a4
// Alcion fork removes the validation of email addresses as we might get incomplete email addresses // Alcion fork removes the validation of email addresses as we might get incomplete email addresses
github.com/xhit/go-simple-mail/v2 v2.16.0 => github.com/alcionai/go-simple-mail/v2 v2.0.0-20231220071811-c70ebcd9a41a github.com/xhit/go-simple-mail/v2 v2.16.0 => github.com/alcionai/go-simple-mail/v2 v2.0.0-20231220071811-c70ebcd9a41a
@ -11,7 +11,7 @@ replace (
require ( require (
github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1 github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
github.com/alcionai/clues v0.0.0-20231222002615-24ee69e6ecc2 github.com/alcionai/clues v0.0.0-20240125221452-9fc7746dd20c
github.com/armon/go-metrics v0.4.1 github.com/armon/go-metrics v0.4.1
github.com/aws/aws-xray-sdk-go v1.8.3 github.com/aws/aws-xray-sdk-go v1.8.3
github.com/cenkalti/backoff/v4 v4.2.1 github.com/cenkalti/backoff/v4 v4.2.1
@ -51,7 +51,7 @@ require (
) )
require ( require (
github.com/arran4/golang-ical v0.2.3 github.com/arran4/golang-ical v0.2.4
github.com/emersion/go-vcard v0.0.0-20230815062825-8fda7d206ec9 github.com/emersion/go-vcard v0.0.0-20230815062825-8fda7d206ec9
jaytaylor.com/html2text v0.0.0-20230321000545-74c2419ad056 jaytaylor.com/html2text v0.0.0-20230321000545-74c2419ad056
) )
@ -121,7 +121,7 @@ require (
github.com/mgutz/ansi v0.0.0-20200706080929-d51e80ef957d // indirect github.com/mgutz/ansi v0.0.0-20200706080929-d51e80ef957d // indirect
github.com/microsoft/kiota-serialization-text-go v1.0.0 github.com/microsoft/kiota-serialization-text-go v1.0.0
github.com/minio/md5-simd v1.1.2 // indirect github.com/minio/md5-simd v1.1.2 // indirect
github.com/minio/minio-go/v7 v7.0.66 github.com/minio/minio-go/v7 v7.0.67
github.com/minio/sha256-simd v1.0.1 // indirect github.com/minio/sha256-simd v1.0.1 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect github.com/modern-go/reflect2 v1.0.2 // indirect

View File

@ -19,12 +19,12 @@ github.com/VividCortex/ewma v1.2.0 h1:f58SaIzcDXrSy3kWaHNvuJgJ3Nmz59Zji6XoJR/q1o
github.com/VividCortex/ewma v1.2.0/go.mod h1:nz4BbCtbLyFDeC9SUHbtcT5644juEuWfUAUnGx7j5l4= github.com/VividCortex/ewma v1.2.0/go.mod h1:nz4BbCtbLyFDeC9SUHbtcT5644juEuWfUAUnGx7j5l4=
github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d h1:licZJFw2RwpHMqeKTCYkitsPqHNxTmd4SNR5r94FGM8= github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d h1:licZJFw2RwpHMqeKTCYkitsPqHNxTmd4SNR5r94FGM8=
github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d/go.mod h1:asat636LX7Bqt5lYEZ27JNDcqxfjdBQuJ/MM4CN/Lzo= github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d/go.mod h1:asat636LX7Bqt5lYEZ27JNDcqxfjdBQuJ/MM4CN/Lzo=
github.com/alcionai/clues v0.0.0-20231222002615-24ee69e6ecc2 h1:Oiz7puLziTpDUsEoiZMNor3j6um8RSvPOSIf4heGgTk= github.com/alcionai/clues v0.0.0-20240125221452-9fc7746dd20c h1:QtARFaqYKtGjmEejr07KFf2iyfCAdTxYGRAAFveLjFA=
github.com/alcionai/clues v0.0.0-20231222002615-24ee69e6ecc2/go.mod h1:1YJwJy3W6GGsC2UiDAEWABUjgvT8OZHjKs8OoaXeKbw= github.com/alcionai/clues v0.0.0-20240125221452-9fc7746dd20c/go.mod h1:1YJwJy3W6GGsC2UiDAEWABUjgvT8OZHjKs8OoaXeKbw=
github.com/alcionai/go-simple-mail/v2 v2.0.0-20231220071811-c70ebcd9a41a h1:4nhM0NM1qtUT1s55rQ+D0Xw1Re5mIU9/crjEl6KdE+k= github.com/alcionai/go-simple-mail/v2 v2.0.0-20231220071811-c70ebcd9a41a h1:4nhM0NM1qtUT1s55rQ+D0Xw1Re5mIU9/crjEl6KdE+k=
github.com/alcionai/go-simple-mail/v2 v2.0.0-20231220071811-c70ebcd9a41a/go.mod h1:b7P5ygho6SYE+VIqpxA6QkYfv4teeyG4MKqB3utRu98= github.com/alcionai/go-simple-mail/v2 v2.0.0-20231220071811-c70ebcd9a41a/go.mod h1:b7P5ygho6SYE+VIqpxA6QkYfv4teeyG4MKqB3utRu98=
github.com/alcionai/kopia v0.12.2-0.20240116215733-ec3d100029fe h1:nLS5pxhm04Jz4+qeipNlxdyPGxqNWpBu8UGkRYpWoIw= github.com/alcionai/kopia v0.12.2-0.20240322180947-41471159a0a4 h1:3YZ70H3mkUgwiHLiNvukrqh2awRgfl1RAkbV0IoUqqk=
github.com/alcionai/kopia v0.12.2-0.20240116215733-ec3d100029fe/go.mod h1:QFRSOUQzZfKE3hKVBHP7hxOn5WyrEmdBtfN5wkib/eA= github.com/alcionai/kopia v0.12.2-0.20240322180947-41471159a0a4/go.mod h1:QFRSOUQzZfKE3hKVBHP7hxOn5WyrEmdBtfN5wkib/eA=
github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc= github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc= github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0= github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
@ -35,8 +35,8 @@ github.com/andybalholm/brotli v1.0.6 h1:Yf9fFpf49Zrxb9NlQaluyE92/+X7UVHlhMNJN2sx
github.com/andybalholm/brotli v1.0.6/go.mod h1:fO7iG3H7G2nSZ7m0zPUDn85XEX2GTukHGRSepvi9Eig= github.com/andybalholm/brotli v1.0.6/go.mod h1:fO7iG3H7G2nSZ7m0zPUDn85XEX2GTukHGRSepvi9Eig=
github.com/armon/go-metrics v0.4.1 h1:hR91U9KYmb6bLBYLQjyM+3j+rcd/UhE+G78SFnF8gJA= github.com/armon/go-metrics v0.4.1 h1:hR91U9KYmb6bLBYLQjyM+3j+rcd/UhE+G78SFnF8gJA=
github.com/armon/go-metrics v0.4.1/go.mod h1:E6amYzXo6aW1tqzoZGT755KkbgrJsSdpwZ+3JqfkOG4= github.com/armon/go-metrics v0.4.1/go.mod h1:E6amYzXo6aW1tqzoZGT755KkbgrJsSdpwZ+3JqfkOG4=
github.com/arran4/golang-ical v0.2.3 h1:C4Vj7+BjJBIrAJhHgi6Ku+XUkQVugRq4re5Cqj5QVdE= github.com/arran4/golang-ical v0.2.4 h1:0/rTXn2qqEekLKec3SzRRy+z7pCLtniMb0KD/dPogUo=
github.com/arran4/golang-ical v0.2.3/go.mod h1:RqMuPGmwRRwjkb07hmm+JBqcWa1vF1LvVmPtSZN2OhQ= github.com/arran4/golang-ical v0.2.4/go.mod h1:RqMuPGmwRRwjkb07hmm+JBqcWa1vF1LvVmPtSZN2OhQ=
github.com/aws/aws-sdk-go v1.48.6 h1:hnL/TE3eRigirDLrdRE9AWE1ALZSVLAsC4wK8TGsMqk= github.com/aws/aws-sdk-go v1.48.6 h1:hnL/TE3eRigirDLrdRE9AWE1ALZSVLAsC4wK8TGsMqk=
github.com/aws/aws-sdk-go v1.48.6/go.mod h1:LF8svs817+Nz+DmiMQKTO3ubZ/6IaTpq3TjupRn3Eqk= github.com/aws/aws-sdk-go v1.48.6/go.mod h1:LF8svs817+Nz+DmiMQKTO3ubZ/6IaTpq3TjupRn3Eqk=
github.com/aws/aws-xray-sdk-go v1.8.3 h1:S8GdgVncBRhzbNnNUgTPwhEqhwt2alES/9rLASyhxjU= github.com/aws/aws-xray-sdk-go v1.8.3 h1:S8GdgVncBRhzbNnNUgTPwhEqhwt2alES/9rLASyhxjU=
@ -219,8 +219,8 @@ github.com/microsoftgraph/msgraph-sdk-go-core v1.0.1 h1:uq4qZD8VXLiNZY0t4NoRpLDo
github.com/microsoftgraph/msgraph-sdk-go-core v1.0.1/go.mod h1:HUITyuFN556+0QZ/IVfH5K4FyJM7kllV6ExKi2ImKhE= github.com/microsoftgraph/msgraph-sdk-go-core v1.0.1/go.mod h1:HUITyuFN556+0QZ/IVfH5K4FyJM7kllV6ExKi2ImKhE=
github.com/minio/md5-simd v1.1.2 h1:Gdi1DZK69+ZVMoNHRXJyNcxrMA4dSxoYHZSQbirFg34= github.com/minio/md5-simd v1.1.2 h1:Gdi1DZK69+ZVMoNHRXJyNcxrMA4dSxoYHZSQbirFg34=
github.com/minio/md5-simd v1.1.2/go.mod h1:MzdKDxYpY2BT9XQFocsiZf/NKVtR7nkE4RoEpN+20RM= github.com/minio/md5-simd v1.1.2/go.mod h1:MzdKDxYpY2BT9XQFocsiZf/NKVtR7nkE4RoEpN+20RM=
github.com/minio/minio-go/v7 v7.0.66 h1:bnTOXOHjOqv/gcMuiVbN9o2ngRItvqE774dG9nq0Dzw= github.com/minio/minio-go/v7 v7.0.67 h1:BeBvZWAS+kRJm1vGTMJYVjKUNoo0FoEt/wUWdUtfmh8=
github.com/minio/minio-go/v7 v7.0.66/go.mod h1:DHAgmyQEGdW3Cif0UooKOyrT3Vxs82zNdV6tkKhRtbs= github.com/minio/minio-go/v7 v7.0.67/go.mod h1:+UXocnUeZ3wHvVh5s95gcrA4YjMIbccT6ubB+1m054A=
github.com/minio/sha256-simd v1.0.1 h1:6kaan5IFmwTNynnKKpDHe6FWHohJOHhCPchzK49dzMM= github.com/minio/sha256-simd v1.0.1 h1:6kaan5IFmwTNynnKKpDHe6FWHohJOHhCPchzK49dzMM=
github.com/minio/sha256-simd v1.0.1/go.mod h1:Pz6AKMiUdngCLpeTL/RJY1M9rUuPMYujV5xJjtbRSN8= github.com/minio/sha256-simd v1.0.1/go.mod h1:Pz6AKMiUdngCLpeTL/RJY1M9rUuPMYujV5xJjtbRSN8=
github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY= github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=

View File

@ -10,6 +10,7 @@ import (
"github.com/alcionai/corso/src/pkg/dttm" "github.com/alcionai/corso/src/pkg/dttm"
"github.com/alcionai/corso/src/pkg/export" "github.com/alcionai/corso/src/pkg/export"
"github.com/alcionai/corso/src/pkg/logger"
) )
const ( const (
@ -56,12 +57,22 @@ func ZipExportCollection(
defer wr.Close() defer wr.Close()
buf := make([]byte, ZipCopyBufferSize) buf := make([]byte, ZipCopyBufferSize)
counted := 0
log := logger.Ctx(ctx).
With("collection_count", len(expCollections))
for _, ec := range expCollections { for _, ec := range expCollections {
folder := ec.BasePath() folder := ec.BasePath()
items := ec.Items(ctx) items := ec.Items(ctx)
for item := range items { for item := range items {
counted++
// Log every 1000 items that are processed
if counted%1000 == 0 {
log.Infow("progress zipping export items", "count_items", counted)
}
err := item.Error err := item.Error
if err != nil { if err != nil {
writer.CloseWithError(clues.Wrap(err, "getting export item").With("id", item.ID)) writer.CloseWithError(clues.Wrap(err, "getting export item").With("id", item.ID))
@ -88,8 +99,12 @@ func ZipExportCollection(
writer.CloseWithError(clues.Wrap(err, "writing zip entry").With("name", name).With("id", item.ID)) writer.CloseWithError(clues.Wrap(err, "writing zip entry").With("name", name).With("id", item.ID))
return return
} }
item.Body.Close()
} }
} }
log.Infow("completed zipping export items", "count_items", counted)
}() }()
return zipCollection{reader}, nil return zipCollection{reader}, nil
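Two behavioral additions in the loop above: a heartbeat log every 1000 items keeps long-running exports observable, and an explicit item.Body.Close() after the bytes are copied releases each item's underlying reader. Sketched in isolation, using the names from the diff:

for item := range items {
	counted++
	if counted%1000 == 0 {
		log.Infow("progress zipping export items", "count_items", counted)
	}
	// ... copy item.Body into the zip entry ...
	item.Body.Close() // release the reader once the copy completes
}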

View File

@ -1,10 +1,13 @@
package jwt package jwt
import ( import (
"context"
"time" "time"
"github.com/alcionai/clues" "github.com/alcionai/clues"
jwt "github.com/golang-jwt/jwt/v5" jwt "github.com/golang-jwt/jwt/v5"
"github.com/alcionai/corso/src/pkg/logger"
) )
// IsJWTExpired checks if the JWT token is past expiry by analyzing the // IsJWTExpired checks if the JWT token is past expiry by analyzing the
@ -37,3 +40,51 @@ func IsJWTExpired(
return expired, nil return expired, nil
} }
// GetJWTLifetime returns the issued at (iat) and expiration time (exp) claims
// present in the JWT token. These are optional claims and may not be present
// in the token. Absence is not reported as an error.
//
// An error is returned if the supplied token is malformed. Times are returned
// in UTC to have parity with graph responses.
func GetJWTLifetime(
ctx context.Context,
rawToken string,
) (time.Time, time.Time, error) {
var (
issuedAt time.Time
expiresAt time.Time
)
p := jwt.NewParser()
token, _, err := p.ParseUnverified(rawToken, &jwt.RegisteredClaims{})
if err != nil {
logger.CtxErr(ctx, err).Debug("parsing jwt token")
return time.Time{}, time.Time{}, clues.Wrap(err, "invalid jwt")
}
exp, err := token.Claims.GetExpirationTime()
if err != nil {
logger.CtxErr(ctx, err).Debug("extracting exp claim")
return time.Time{}, time.Time{}, clues.Wrap(err, "getting token expiry time")
}
iat, err := token.Claims.GetIssuedAt()
if err != nil {
logger.CtxErr(ctx, err).Debug("extracting iat claim")
return time.Time{}, time.Time{}, clues.Wrap(err, "getting token issued at time")
}
// Absence of iat or exp claims is not reported as an error by jwt library as these
// are optional as per spec.
if iat != nil {
issuedAt = iat.UTC()
}
if exp != nil {
expiresAt = exp.UTC()
}
return issuedAt, expiresAt, nil
}
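A usage sketch for the new helper; zero-valued times indicate the claim was absent from the token (the surrounding error handling is illustrative):

iat, exp, err := GetJWTLifetime(ctx, rawToken)
if err != nil {
	return clues.Wrap(err, "inspecting token lifetime")
}

if !exp.IsZero() && time.Now().UTC().After(exp) {
	// token already expired; refresh it before issuing requests
}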

View File

@ -113,3 +113,134 @@ func (suite *JWTUnitSuite) TestIsJWTExpired() {
}) })
} }
} }
func (suite *JWTUnitSuite) TestGetJWTLifetime() {
// Set of time values to be used in the tests.
// Truncate to seconds for comparisons since jwt tokens have second
// level precision.
idToTime := map[string]time.Time{
"T0": time.Now().UTC().Add(-time.Hour).Truncate(time.Second),
"T1": time.Now().UTC().Truncate(time.Second),
"T2": time.Now().UTC().Add(time.Hour).Truncate(time.Second),
}
table := []struct {
name string
getToken func() (string, error)
expectFunc func(t *testing.T, iat time.Time, exp time.Time)
expectErr assert.ErrorAssertionFunc
}{
{
name: "alive token",
getToken: func() (string, error) {
return createJWTToken(
jwt.RegisteredClaims{
IssuedAt: jwt.NewNumericDate(idToTime["T0"]),
ExpiresAt: jwt.NewNumericDate(idToTime["T1"]),
})
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Equal(t, idToTime["T0"], iat)
assert.Equal(t, idToTime["T1"], exp)
},
expectErr: assert.NoError,
},
// Test with a token which is not generated using the go-jwt lib.
// This is a long-lived token which is valid for 100 years.
{
name: "alive raw token with iat and exp claims",
getToken: func() (string, error) {
return rawToken, nil
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Less(t, iat, time.Now(), "iat should be in the past")
assert.Greater(t, exp, time.Now(), "exp should be in the future")
},
expectErr: assert.NoError,
},
// Regardless of whether the token is expired or not, we should be able to
// extract the iat and exp claims from it without error.
{
name: "expired token",
getToken: func() (string, error) {
return createJWTToken(
jwt.RegisteredClaims{
IssuedAt: jwt.NewNumericDate(idToTime["T1"]),
ExpiresAt: jwt.NewNumericDate(idToTime["T0"]),
})
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Equal(t, idToTime["T1"], iat)
assert.Equal(t, idToTime["T0"], exp)
},
expectErr: assert.NoError,
},
{
name: "missing iat claim",
getToken: func() (string, error) {
return createJWTToken(
jwt.RegisteredClaims{
ExpiresAt: jwt.NewNumericDate(idToTime["T2"]),
})
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Equal(t, time.Time{}, iat)
assert.Equal(t, idToTime["T2"], exp)
},
expectErr: assert.NoError,
},
{
name: "missing exp claim",
getToken: func() (string, error) {
return createJWTToken(
jwt.RegisteredClaims{
IssuedAt: jwt.NewNumericDate(idToTime["T0"]),
})
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Equal(t, idToTime["T0"], iat)
assert.Equal(t, time.Time{}, exp)
},
expectErr: assert.NoError,
},
{
name: "both claims missing",
getToken: func() (string, error) {
return createJWTToken(jwt.RegisteredClaims{})
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Equal(t, time.Time{}, iat)
assert.Equal(t, time.Time{}, exp)
},
expectErr: assert.NoError,
},
{
name: "malformed token",
getToken: func() (string, error) {
return "header.claims.signature", nil
},
expectFunc: func(t *testing.T, iat time.Time, exp time.Time) {
assert.Equal(t, time.Time{}, iat)
assert.Equal(t, time.Time{}, exp)
},
expectErr: assert.Error,
},
}
for _, test := range table {
suite.Run(test.name, func() {
t := suite.T()
ctx, flush := tester.NewContext(t)
defer flush()
token, err := test.getToken()
require.NoError(t, err)
iat, exp, err := GetJWTLifetime(ctx, token)
test.expectErr(t, err)
test.expectFunc(t, iat, exp)
})
}
}

View File

@ -59,6 +59,19 @@ func First(vs ...string) string {
return "" return ""
} }
// FirstIn returns the first non-zero value found in the map
// while iterating the provided list of keys.
func FirstIn(m map[string]any, keys ...string) string {
for _, key := range keys {
v, err := AnyValueToString(key, m)
if err == nil && len(v) > 0 {
return v
}
}
return ""
}
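A small usage sketch (map contents illustrative): FirstIn walks the keys in order and returns the first value that stringifies to a non-empty result.

m := map[string]any{
	"displayName": "",
	"name":        "weekly-report",
}

title := FirstIn(m, "displayName", "name") // "weekly-report"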
// Preview reduces the string to the specified size. // Preview reduces the string to the specified size.
// If the string is longer than the size, the last three // If the string is longer than the size, the last three
// characters are replaced with an ellipsis. Size < 4 // characters are replaced with an ellipsis. Size < 4

View File

@ -118,3 +118,96 @@ func TestGenerateHash(t *testing.T) {
} }
} }
} }
func TestFirstIn(t *testing.T) {
table := []struct {
name string
m map[string]any
keys []string
expect string
}{
{
name: "nil map",
keys: []string{"foo", "bar"},
expect: "",
},
{
name: "empty map",
m: map[string]any{},
keys: []string{"foo", "bar"},
expect: "",
},
{
name: "no match",
m: map[string]any{
"baz": "baz",
},
keys: []string{"foo", "bar"},
expect: "",
},
{
name: "no keys",
m: map[string]any{
"baz": "baz",
},
keys: []string{},
expect: "",
},
{
name: "nil match",
m: map[string]any{
"foo": nil,
},
keys: []string{"foo", "bar"},
expect: "",
},
{
name: "empty match",
m: map[string]any{
"foo": "",
},
keys: []string{"foo", "bar"},
expect: "",
},
{
name: "matches first key",
m: map[string]any{
"foo": "fnords",
},
keys: []string{"foo", "bar"},
expect: "fnords",
},
{
name: "matches second key",
m: map[string]any{
"bar": "smarf",
},
keys: []string{"foo", "bar"},
expect: "smarf",
},
{
name: "matches second key with nil first match",
m: map[string]any{
"foo": nil,
"bar": "smarf",
},
keys: []string{"foo", "bar"},
expect: "smarf",
},
{
name: "matches second key with empty first match",
m: map[string]any{
"foo": "",
"bar": "smarf",
},
keys: []string{"foo", "bar"},
expect: "smarf",
},
}
for _, test := range table {
t.Run(test.name, func(t *testing.T) {
result := FirstIn(test.m, test.keys...)
assert.Equal(t, test.expect, result)
})
}
}

View File

@ -23,6 +23,7 @@ import (
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/common/str" "github.com/alcionai/corso/src/internal/common/str"
"github.com/alcionai/corso/src/internal/converters/ics" "github.com/alcionai/corso/src/internal/converters/ics"
"github.com/alcionai/corso/src/internal/m365/collection/groups/metadata"
"github.com/alcionai/corso/src/pkg/logger" "github.com/alcionai/corso/src/pkg/logger"
"github.com/alcionai/corso/src/pkg/services/m365/api" "github.com/alcionai/corso/src/pkg/services/m365/api"
) )
@ -142,6 +143,121 @@ func getICalData(ctx context.Context, data models.Messageable) (string, error) {
return ics.FromEventable(ctx, event) return ics.FromEventable(ctx, event)
} }
func getFileAttachment(ctx context.Context, attachment models.Attachmentable) (*mail.File, error) {
kind := ptr.Val(attachment.GetContentType())
bytes, err := attachment.GetBackingStore().Get("contentBytes")
if err != nil {
return nil, clues.WrapWC(ctx, err, "failed to get attachment bytes").
With("kind", kind)
}
if bytes == nil {
// TODO(meain): Handle non-file attachments
// https://github.com/alcionai/corso/issues/4772
logger.Ctx(ctx).
With("attachment_id", ptr.Val(attachment.GetId()),
"attachment_type", ptr.Val(attachment.GetOdataType())).
Info("no contentBytes for attachment")
return nil, nil
}
bts, ok := bytes.([]byte)
if !ok {
// err is nil at this point; the failure is the type assertion itself
return nil, clues.NewWC(ctx, "invalid content bytes").
With("kind", kind).
With("interface_type", fmt.Sprintf("%T", bytes))
}
name := ptr.Val(attachment.GetName())
if len(name) == 0 {
// Graph as of now does not let us create any attachments
// without a name, but we have run into instances of
// attachments without a name, possibly from old
// data. This is for those cases.
name = "Unnamed"
}
contentID, err := attachment.GetBackingStore().Get("contentId")
if err != nil {
return nil, clues.WrapWC(ctx, err, "getting content id for attachment").
With("kind", kind)
}
if contentID != nil {
cids, _ := str.AnyToString(contentID)
if len(cids) > 0 {
name = cids
}
}
return &mail.File{
// cannot use the filename, as inline attachments would not get mapped properly
Name: name,
MimeType: kind,
Data: bts,
Inline: ptr.Val(attachment.GetIsInline()),
}, nil
}
func getItemAttachment(ctx context.Context, attachment models.Attachmentable) (*mail.File, error) {
it, err := attachment.GetBackingStore().Get("item")
if err != nil {
return nil, clues.WrapWC(ctx, err, "getting item for attachment").
With("attachment_id", ptr.Val(attachment.GetId()))
}
name := ptr.Val(attachment.GetName())
if len(name) == 0 {
// Graph as of now does not let us create any attachments
// without a name, but we have run into instances of
// attachments without a name, possibly from old
// data. This is for those cases.
name = "Unnamed"
}
switch it := it.(type) {
case *models.Message:
cb, err := FromMessageable(ctx, it)
if err != nil {
return nil, clues.WrapWC(ctx, err, "converting item attachment to eml").
With("attachment_id", ptr.Val(attachment.GetId()))
}
return &mail.File{
Name: name,
MimeType: "message/rfc822",
Data: []byte(cb),
}, nil
default:
logger.Ctx(ctx).
With("attachment_id", ptr.Val(attachment.GetId()),
"attachment_type", ptr.Val(attachment.GetOdataType())).
Info("unknown item attachment type")
}
return nil, nil
}
func getMailAttachment(ctx context.Context, att models.Attachmentable) (*mail.File, error) {
otyp := ptr.Val(att.GetOdataType())
switch otyp {
case "#microsoft.graph.fileAttachment":
return getFileAttachment(ctx, att)
case "#microsoft.graph.itemAttachment":
return getItemAttachment(ctx, att)
default:
logger.Ctx(ctx).
With("attachment_id", ptr.Val(att.GetId()),
"attachment_type", otyp).
Info("unknown attachment type")
return nil, nil
}
}
// FromJSON converts a Messageable (as json) to .eml format // FromJSON converts a Messageable (as json) to .eml format
func FromJSON(ctx context.Context, body []byte) (string, error) { func FromJSON(ctx context.Context, body []byte) (string, error) {
ctx = clues.Add(ctx, "body_len", len(body)) ctx = clues.Add(ctx, "body_len", len(body))
@ -151,6 +267,11 @@ func FromJSON(ctx context.Context, body []byte) (string, error) {
return "", clues.WrapWC(ctx, err, "converting to messageble") return "", clues.WrapWC(ctx, err, "converting to messageble")
} }
return FromMessageable(ctx, data)
}
// Converts a Messageable to .eml format
func FromMessageable(ctx context.Context, data models.Messageable) (string, error) {
ctx = clues.Add(ctx, "item_id", ptr.Val(data.GetId())) ctx = clues.Add(ctx, "item_id", ptr.Val(data.GetId()))
email := mail.NewMSG() email := mail.NewMSG()
@ -226,6 +347,115 @@ func FromJSON(ctx context.Context, body []byte) (string, error) {
} }
} }
if data.GetAttachments() != nil {
for _, attachment := range data.GetAttachments() {
att, err := getMailAttachment(ctx, attachment)
if err != nil {
return "", clues.WrapWC(ctx, err, "getting mail attachment")
}
// There are known cases where we just want to log and
// ignore instead of erroring out
if att != nil {
email.Attach(att)
}
}
}
switch data.(type) {
case *models.EventMessageResponse, *models.EventMessage:
// We can't handle this as of now, not enough information
// TODO: Fetch event object from graph when fetching email
case *models.CalendarSharingMessage:
// TODO: Parse out calendar sharing message
// https://github.com/alcionai/corso/issues/5041
case *models.EventMessageRequest:
cal, err := getICalData(ctx, data)
if err != nil {
return "", clues.Wrap(err, "getting ical attachment")
}
if len(cal) > 0 {
email.AddAlternative(mail.TextCalendar, cal)
}
}
if err := email.GetError(); err != nil {
return "", clues.WrapWC(ctx, err, "converting to eml")
}
return email.GetMessage(), nil
}
//-------------------------------------------------------------
// Postable -> EML
//-------------------------------------------------------------
// FromJSONPostToEML converts a postable (as json) to .eml format.
// TODO(pandeyabs): This is a stripped-down copy of the messageable-to-
// eml conversion; it could be folded into one function by adding a post-
// to-messageable converter.
func FromJSONPostToEML(
ctx context.Context,
body []byte,
postMetadata metadata.ConversationPostMetadata,
) (string, error) {
ctx = clues.Add(ctx, "body_len", len(body))
data, err := api.BytesToPostable(body)
if err != nil {
return "", clues.WrapWC(ctx, err, "converting to postable")
}
ctx = clues.Add(ctx, "item_id", ptr.Val(data.GetId()))
email := mail.NewMSG()
email.Encoding = mail.EncodingBase64 // Base64 encoding to be safe for when we have eventMessage content (newline issues)
email.AllowDuplicateAddress = true // More "correct" conversion
email.AddBccToHeader = true // Don't ignore Bcc
email.AllowEmptyAttachments = true // Don't error on empty attachments
email.UseProvidedAddress = true // Don't try to parse the email address
if data.GetFrom() != nil {
email.SetFrom(formatAddress(data.GetFrom().GetEmailAddress()))
}
// We don't have the To, Cc, Bcc recipient information for posts due to a graph
// limitation. All posts carry the group email address as the only recipient
// for now.
email.AddTo(postMetadata.Recipients...)
email.SetSubject(postMetadata.Topic)
// Reply-To email address is not available for posts. Note that this is different
// from inReplyTo field.
if data.GetCreatedDateTime() != nil {
email.SetDate(ptr.Val(data.GetCreatedDateTime()).Format(dateFormat))
}
if data.GetBody() != nil {
if data.GetBody().GetContentType() != nil {
var contentType mail.ContentType
switch data.GetBody().GetContentType().String() {
case "html":
contentType = mail.TextHTML
case "text":
contentType = mail.TextPlain
default:
// https://learn.microsoft.com/en-us/graph/api/resources/itembody?view=graph-rest-1.0#properties
// This should not be possible according to the documentation
logger.Ctx(ctx).
With("body_type", data.GetBody().GetContentType().String()).
Info("unknown body content type")
contentType = mail.TextPlain
}
email.SetBody(contentType, ptr.Val(data.GetBody().GetContent()))
}
}
if data.GetAttachments() != nil { if data.GetAttachments() != nil {
for _, attachment := range data.GetAttachments() { for _, attachment := range data.GetAttachments() {
kind := ptr.Val(attachment.GetContentType()) kind := ptr.Val(attachment.GetContentType())
@ -239,6 +469,9 @@ func FromJSON(ctx context.Context, body []byte) (string, error) {
if bytes == nil { if bytes == nil {
// TODO(meain): Handle non-file attachments // TODO(meain): Handle non-file attachments
// https://github.com/alcionai/corso/issues/4772 // https://github.com/alcionai/corso/issues/4772
//
// TODO(pandeyabs): Above issue is for messages.
// This is not a problem for posts but leaving it here for safety.
logger.Ctx(ctx). logger.Ctx(ctx).
With("attachment_id", ptr.Val(attachment.GetId()), With("attachment_id", ptr.Val(attachment.GetId()),
"attachment_type", ptr.Val(attachment.GetOdataType())). "attachment_type", ptr.Val(attachment.GetOdataType())).
@ -255,6 +488,9 @@ func FromJSON(ctx context.Context, body []byte) (string, error) {
} }
name := ptr.Val(attachment.GetName()) name := ptr.Val(attachment.GetName())
if len(name) == 0 {
name = "Unnamed"
}
contentID, err := attachment.GetBackingStore().Get("contentId") contentID, err := attachment.GetBackingStore().Get("contentId")
if err != nil { if err != nil {
@ -279,24 +515,8 @@ func FromJSON(ctx context.Context, body []byte) (string, error) {
} }
} }
switch data.(type) { // Note: Posts cannot be of type EventMessageResponse, EventMessage or
case *models.EventMessageResponse, *models.EventMessage: // CalendarSharingMessage. So we don't need to handle those cases here.
// We can't handle this as of now, not enough information
// TODO: Fetch event object from graph when fetching email
case *models.CalendarSharingMessage:
// TODO: Parse out calendar sharing message
// https://github.com/alcionai/corso/issues/5041
case *models.EventMessageRequest:
cal, err := getICalData(ctx, data)
if err != nil {
return "", clues.Wrap(err, "getting ical attachment")
}
if len(cal) > 0 {
email.AddAlternative(mail.TextCalendar, cal)
}
}
if err = email.GetError(); err != nil { if err = email.GetError(); err != nil {
return "", clues.WrapWC(ctx, err, "converting to eml") return "", clues.WrapWC(ctx, err, "converting to eml")
} }
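With the split above, FromJSON remains the entry point for serialized message bytes while FromMessageable accepts in-memory models directly, which is what allows getItemAttachment to embed an attached message as message/rfc822. Illustrative calls:

// serialized message bytes -> eml
emlText, err := FromJSON(ctx, body)

// an already-deserialized models.Messageable -> eml
emlText, err = FromMessageable(ctx, msg)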

View File

@ -18,6 +18,8 @@ import (
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/converters/eml/testdata" "github.com/alcionai/corso/src/internal/converters/eml/testdata"
"github.com/alcionai/corso/src/internal/converters/ics" "github.com/alcionai/corso/src/internal/converters/ics"
"github.com/alcionai/corso/src/internal/m365/collection/groups/metadata"
stub "github.com/alcionai/corso/src/internal/m365/service/groups/mock"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/pkg/services/m365/api" "github.com/alcionai/corso/src/pkg/services/m365/api"
) )
@ -135,6 +137,11 @@ func (suite *EMLUnitSuite) TestConvert_messageble_to_eml() {
}
func (suite *EMLUnitSuite) TestConvert_edge_cases() {
bodies := []string{
testdata.EmailWithAttachments,
testdata.EmailWithinEmail,
}
tests := []struct {
name string
transform func(models.Messageable)
@ -160,35 +167,75 @@ func (suite *EMLUnitSuite) TestConvert_edge_cases() {
require.NoError(suite.T(), err, "setting attachment content")
},
},
{
name: "attachment without name",
transform: func(msg models.Messageable) {
attachments := msg.GetAttachments()
attachments[1].SetName(ptr.To(""))
// This test has to be run on a non inline attachment
// as inline attachments use contentID instead of name
// even when there is a name.
assert.False(suite.T(), ptr.Val(attachments[1].GetIsInline()))
},
},
{
name: "attachment with nil name",
transform: func(msg models.Messageable) {
attachments := msg.GetAttachments()
attachments[1].SetName(nil)
// This test has to be run on a non inline attachment
// as inline attachments use contentID instead of name
// even when there is a name.
assert.False(suite.T(), ptr.Val(attachments[1].GetIsInline()))
},
},
{
name: "multiple attachments without name",
transform: func(msg models.Messageable) {
attachments := msg.GetAttachments()
attachments[1].SetName(ptr.To(""))
attachments[2].SetName(ptr.To(""))
// This test has to be run on a non inline attachment
// as inline attachments use contentID instead of name
// even when there is a name.
assert.False(suite.T(), ptr.Val(attachments[1].GetIsInline()))
assert.False(suite.T(), ptr.Val(attachments[2].GetIsInline()))
},
},
}
for _, b := range bodies {
for _, test := range tests {
suite.Run(test.name, func() {
t := suite.T()
ctx, flush := tester.NewContext(t)
defer flush()
body := []byte(b)
msg, err := api.BytesToMessageable(body)
require.NoError(t, err, "creating message")
test.transform(msg)
writer := kjson.NewJsonSerializationWriter()
defer writer.Close()
err = writer.WriteObjectValue("", msg)
require.NoError(t, err, "serializing message")
nbody, err := writer.GetSerializedContent()
require.NoError(t, err, "getting serialized content")
_, err = FromJSON(ctx, nbody)
assert.NoError(t, err, "converting to eml")
})
}
}
}
@ -226,11 +273,11 @@ func (suite *EMLUnitSuite) TestConvert_eml_ics() {
assert.Equal(
t,
msg.GetCreatedDateTime().Format(ics.ICalDateTimeFormatUTC),
event.GetProperty(ical.ComponentPropertyCreated).Value)
assert.Equal(
t,
msg.GetLastModifiedDateTime().Format(ics.ICalDateTimeFormatUTC),
event.GetProperty(ical.ComponentPropertyLastModified).Value)
st, err := ics.GetUTCTime(
@ -245,11 +292,11 @@ func (suite *EMLUnitSuite) TestConvert_eml_ics() {
assert.Equal(
t,
st.Format(ics.ICalDateTimeFormatUTC),
event.GetProperty(ical.ComponentPropertyDtStart).Value)
assert.Equal(
t,
et.Format(ics.ICalDateTimeFormatUTC),
event.GetProperty(ical.ComponentPropertyDtEnd).Value)
tos := msg.GetToRecipients()
@ -325,3 +372,119 @@ func (suite *EMLUnitSuite) TestConvert_eml_ics_from_event_obj() {
assert.NotEqual(t, ptr.Val(msg.GetSubject()), event.GetProperty(ical.ComponentPropertySummary).Value)
assert.Equal(t, ptr.Val(evt.GetSubject()), event.GetProperty(ical.ComponentPropertySummary).Value)
}
//-------------------------------------------------------------
// Postable -> EML tests
//-------------------------------------------------------------
func (suite *EMLUnitSuite) TestConvert_postable_to_eml() {
t := suite.T()
ctx, flush := tester.NewContext(t)
defer flush()
body := []byte(stub.PostWithAttachments)
postMetadata := metadata.ConversationPostMetadata{
Recipients: []string{"group@example.com"},
Topic: "test subject",
}
out, err := FromJSONPostToEML(ctx, body, postMetadata)
assert.NoError(t, err, "converting to eml")
post, err := api.BytesToPostable(body)
require.NoError(t, err, "creating post")
eml, err := enmime.ReadEnvelope(strings.NewReader(out))
require.NoError(t, err, "reading created eml")
assert.Equal(t, postMetadata.Topic, eml.GetHeader("Subject"))
assert.Equal(t, post.GetCreatedDateTime().Format(time.RFC1123Z), eml.GetHeader("Date"))
assert.Equal(t, formatAddress(post.GetFrom().GetEmailAddress()), eml.GetHeader("From"))
// Test recipients. The post metadata should contain the group email address.
tos := strings.Split(eml.GetHeader("To"), ", ")
for _, sourceTo := range postMetadata.Recipients {
assert.Contains(t, tos, sourceTo)
}
// Assert cc, bcc to be empty since they are not supported for posts right now.
assert.Equal(t, "", eml.GetHeader("Cc"))
assert.Equal(t, "", eml.GetHeader("Bcc"))
// Test attachments using PostWithAttachments data as a reference.
// This data has 1 direct attachment and 1 inline attachment.
assert.Equal(t, 1, len(eml.Attachments), "direct attachment count")
assert.Equal(t, 1, len(eml.Inlines), "inline attachment count")
for _, sourceAttachment := range post.GetAttachments() {
targetContent := eml.Attachments[0].Content
if ptr.Val(sourceAttachment.GetIsInline()) {
targetContent = eml.Inlines[0].Content
}
sourceContent, err := sourceAttachment.GetBackingStore().Get("contentBytes")
assert.NoError(t, err, "getting source attachment content")
assert.Equal(t, sourceContent, targetContent)
}
// Test body
source := strings.ReplaceAll(eml.HTML, "\n", "")
target := strings.ReplaceAll(ptr.Val(post.GetBody().GetContent()), "\n", "")
// Replace the cid with a constant value to make the comparison deterministic.
re := regexp.MustCompile(`(?:src|originalSrc)="cid:[^"]*"`)
source = re.ReplaceAllString(source, `src="cid:replaced"`)
target = re.ReplaceAllString(target, `src="cid:replaced"`)
assert.Equal(t, source, target)
}
// Tests an ics within an eml within another eml
func (suite *EMLUnitSuite) TestConvert_message_in_messageble_to_eml() {
t := suite.T()
ctx, flush := tester.NewContext(t)
defer flush()
body := []byte(testdata.EmailWithinEmail)
out, err := FromJSON(ctx, body)
assert.NoError(t, err, "converting to eml")
msg, err := api.BytesToMessageable(body)
require.NoError(t, err, "creating message")
eml, err := enmime.ReadEnvelope(strings.NewReader(out))
require.NoError(t, err, "reading created eml")
assert.Equal(t, ptr.Val(msg.GetSubject()), eml.GetHeader("Subject"))
assert.Equal(t, msg.GetSentDateTime().Format(time.RFC1123Z), eml.GetHeader("Date"))
assert.Equal(t, formatAddress(msg.GetFrom().GetEmailAddress()), eml.GetHeader("From"))
attachments := eml.Attachments
assert.Equal(t, 3, len(attachments), "attachment count in parent email")
ieml, err := enmime.ReadEnvelope(strings.NewReader(string(attachments[0].Content)))
require.NoError(t, err, "reading created eml")
itm, err := msg.GetAttachments()[0].GetBackingStore().Get("item")
require.NoError(t, err, "getting item from message")
imsg := itm.(*models.Message)
assert.Equal(t, ptr.Val(imsg.GetSubject()), ieml.GetHeader("Subject"))
assert.Equal(t, imsg.GetSentDateTime().Format(time.RFC1123Z), ieml.GetHeader("Date"))
assert.Equal(t, formatAddress(imsg.GetFrom().GetEmailAddress()), ieml.GetHeader("From"))
iattachments := ieml.Attachments
assert.Equal(t, 1, len(iattachments), "attachment count in child email")
// Known from testdata
assert.Contains(t, string(iattachments[0].Content), "X-LIC-LOCATION:Africa/Abidjan")
}

View File

@ -104,6 +104,19 @@
"contentId": null, "contentId": null,
"contentLocation": null, "contentLocation": null,
"contentBytes": "W1BhdGhzXQpQcmVmaXggPSAuLgo=" "contentBytes": "W1BhdGhzXQpQcmVmaXggPSAuLgo="
},
{
"@odata.type": "#microsoft.graph.fileAttachment",
"@odata.mediaContentType": "application/octet-stream",
"id": "ZZMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAEwbDEWAAABEgAQAD3rU0iyzCdHgz0xmOrWc9g=",
"lastModifiedDateTime": "2023-11-16T05:42:47Z",
"name": "qt2.conf",
"contentType": "application/octet-stream",
"size": 156,
"isInline": false,
"contentId": null,
"contentLocation": null,
"contentBytes": "Z1BhdGhzXQpQcmVmaXggPSAuLgo="
}
]
}

View File

@ -0,0 +1,268 @@
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAA=",
"@odata.type": "#microsoft.graph.message",
"@odata.context": "https://graph.microsoft.com/v1.0/$metadata#users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages/$entity",
"@odata.etag": "W/\"CQAAABYAAABBFDg0JJk7TY1fmsJrh7tNAAFnDeBl\"",
"categories": [],
"changeKey": "CQAAABYAAABBFDg0JJk7TY1fmsJrh7tNAAFnDeBl",
"createdDateTime": "2024-02-05T09:33:23Z",
"lastModifiedDateTime": "2024-02-05T09:33:48Z",
"attachments": [
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAABEgAQAEUyH0VS3HJBgHDlZdWZl0k=",
"@odata.type": "#microsoft.graph.itemAttachment",
"item@odata.navigationLink": "https://graph.microsoft.com/v1.0/users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages('')",
"item@odata.associationLink": "https://graph.microsoft.com/v1.0/users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages('')/$ref",
"isInline": false,
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"name": "Purpose of life",
"size": 11840,
"item": {
"id": "",
"@odata.type": "#microsoft.graph.message",
"createdDateTime": "2024-02-05T09:33:24Z",
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"attachments": [
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAACEgAQAEUyH0VS3HJBgHDlZdWZl0kSABAAjBhd4-oQaUS969pTkS-gzA==",
"@odata.type": "#microsoft.graph.fileAttachment",
"@odata.mediaContentType": "text/calendar",
"contentType": "text/calendar",
"isInline": false,
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"name": "Abidjan.ics",
"size": 573,
"contentBytes": "QkVHSU46VkNBTEVOREFSDQpQUk9ESUQ6LS8vdHp1cmwub3JnLy9OT05TR01MIE9sc29uIDIwMjNkLy9FTg0KVkVSU0lPTjoyLjANCkJFR0lOOlZUSU1FWk9ORQ0KVFpJRDpBZnJpY2EvQWJpZGphbg0KTEFTVC1NT0RJRklFRDoyMDIzMTIyMlQyMzMzNThaDQpUWlVSTDpodHRwczovL3d3dy50enVybC5vcmcvem9uZWluZm8vQWZyaWNhL0FiaWRqYW4NClgtTElDLUxPQ0FUSU9OOkFmcmljYS9BYmlkamFuDQpYLVBST0xFUFRJQy1UWk5BTUU6TE1UDQpCRUdJTjpTVEFOREFSRA0KVFpOQU1FOkdNVA0KVFpPRkZTRVRGUk9NOi0wMDE2MDgNClRaT0ZGU0VUVE86KzAwMDANCkRUU1RBUlQ6MTkxMjAxMDFUMDAwMDAwDQpFTkQ6U1RBTkRBUkQNCkVORDpWVElNRVpPTkUNCkVORDpWQ0FMRU5EQVINCg=="
}
],
"body": {
"content": "<html><head>\r\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\"><style type=\"text/css\" style=\"display:none;\"> P {margin-top:0;margin-bottom:0;} </style></head><body dir=\"ltr\"><div class=\"elementToProof\" style=\"font-family: Aptos, Aptos_EmbeddedFont, Aptos_MSFontService, Calibri, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);\">I just realized the purpose of my life is to be a test case. Good to know.<br></div></body></html>",
"contentType": "html"
},
"bodyPreview": "I just realized the purpose of my life is to be a test case. Good to know.",
"conversationId": "AAQkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNAAQAFEnxDqYmbJEm8d2l3qfS6A=",
"conversationIndex": "AQHaWBYiUSfEOpiZskSbx3aXep9LoA==",
"flag": {
"flagStatus": "notFlagged"
},
"from": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"hasAttachments": true,
"importance": "normal",
"internetMessageId": "<SJ0PR04MB7294108E381BCCE5C207B6DEBC472@SJ0PR04MB7294.namprd04.prod.outlook.com>",
"isDeliveryReceiptRequested": false,
"isDraft": false,
"isRead": true,
"isReadReceiptRequested": false,
"receivedDateTime": "2024-02-05T09:33:12Z",
"sender": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"sentDateTime": "2024-02-05T09:33:11Z",
"subject": "Purpose of life",
"toRecipients": [
{
"emailAddress": {
"address": "PradeepG@10rqc2.onmicrosoft.com",
"name": "Pradeep Gupta"
}
}
],
"webLink": "https://outlook.office365.com/owa/?AttachmentItemID=AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV%2FqAAABEgAQAEUyH0VS3HJBgHDlZdWZl0k%3D&exvsurl=1&viewmodel=ItemAttachment"
}
},
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAABEgAQAEUyH0VS3HJBgHDlZdWZl02=",
"@odata.type": "#microsoft.graph.itemAttachment",
"item@odata.navigationLink": "https://graph.microsoft.com/v1.0/users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages('')",
"item@odata.associationLink": "https://graph.microsoft.com/v1.0/users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages('')/$ref",
"isInline": false,
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"name": "Purpose of life part 2",
"size": 11840,
"item": {
"id": "",
"@odata.type": "#microsoft.graph.message",
"createdDateTime": "2024-02-05T09:33:24Z",
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"attachments": [
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAACEgAQAEUyH0VS3HJBgHDlZdWZl0kSABAAjBhd4-oQaUS969pTkS-gzA==",
"@odata.type": "#microsoft.graph.fileAttachment",
"@odata.mediaContentType": "text/calendar",
"contentType": "text/calendar",
"isInline": false,
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"name": "Abidjan.ics",
"size": 573,
"contentBytes": "QkVHSU46VkNBTEVOREFSDQpQUk9ESUQ6LS8vdHp1cmwub3JnLy9OT05TR01MIE9sc29uIDIwMjNkLy9FTg0KVkVSU0lPTjoyLjANCkJFR0lOOlZUSU1FWk9ORQ0KVFpJRDpBZnJpY2EvQWJpZGphbg0KTEFTVC1NT0RJRklFRDoyMDIzMTIyMlQyMzMzNThaDQpUWlVSTDpodHRwczovL3d3dy50enVybC5vcmcvem9uZWluZm8vQWZyaWNhL0FiaWRqYW4NClgtTElDLUxPQ0FUSU9OOkFmcmljYS9BYmlkamFuDQpYLVBST0xFUFRJQy1UWk5BTUU6TE1UDQpCRUdJTjpTVEFOREFSRA0KVFpOQU1FOkdNVA0KVFpPRkZTRVRGUk9NOi0wMDE2MDgNClRaT0ZGU0VUVE86KzAwMDANCkRUU1RBUlQ6MTkxMjAxMDFUMDAwMDAwDQpFTkQ6U1RBTkRBUkQNCkVORDpWVElNRVpPTkUNCkVORDpWQ0FMRU5EQVINCg=="
}
],
"body": {
"content": "<html><head>\r\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\"><style type=\"text/css\" style=\"display:none;\"> P {margin-top:0;margin-bottom:0;} </style></head><body dir=\"ltr\"><div class=\"elementToProof\" style=\"font-family: Aptos, Aptos_EmbeddedFont, Aptos_MSFontService, Calibri, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);\">I just realized the purpose of my life is to be a test case. Good to know.<br></div></body></html>",
"contentType": "html"
},
"bodyPreview": "I just realized the purpose of my life is to be a test case. Good to know.",
"conversationId": "AAQkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNAAQAFEnxDqYmbJEm8d2l3qfS6A=",
"conversationIndex": "AQHaWBYiUSfEOpiZskSbx3aXep9LoA==",
"flag": {
"flagStatus": "notFlagged"
},
"from": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"hasAttachments": true,
"importance": "normal",
"internetMessageId": "<SJ0PR04MB7294108E381BCCE5C207B6DEBC472@SJ0PR04MB7294.namprd04.prod.outlook.com>",
"isDeliveryReceiptRequested": false,
"isDraft": false,
"isRead": true,
"isReadReceiptRequested": false,
"receivedDateTime": "2024-02-05T09:33:12Z",
"sender": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"sentDateTime": "2024-02-05T09:33:11Z",
"subject": "Purpose of life",
"toRecipients": [
{
"emailAddress": {
"address": "PradeepG@10rqc2.onmicrosoft.com",
"name": "Pradeep Gupta"
}
}
],
"webLink": "https://outlook.office365.com/owa/?AttachmentItemID=AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV%2FqAAABEgAQAEUyH0VS3HJBgHDlZdWZl02%3D&exvsurl=1&viewmodel=ItemAttachment"
}
},
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAABEgAQAEUyH0VS3HJBgHDlZdWZl03=",
"@odata.type": "#microsoft.graph.itemAttachment",
"item@odata.navigationLink": "https://graph.microsoft.com/v1.0/users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages('')",
"item@odata.associationLink": "https://graph.microsoft.com/v1.0/users('7ceb8e03-bdc5-4509-a136-457526165ec0')/messages('')/$ref",
"isInline": false,
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"name": "Purpose of life part 3",
"size": 11840,
"item": {
"id": "",
"@odata.type": "#microsoft.graph.message",
"createdDateTime": "2024-02-05T09:33:24Z",
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"attachments": [
{
"id": "AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV-qAAACEgAQAEUyH0VS3HJBgHDlZdWZl0kSABAAjBhd4-oQaUS969pTkS-gzA==",
"@odata.type": "#microsoft.graph.fileAttachment",
"@odata.mediaContentType": "text/calendar",
"contentType": "text/calendar",
"isInline": false,
"lastModifiedDateTime": "2024-02-05T09:33:46Z",
"name": "Abidjan.ics",
"size": 573,
"contentBytes": "QkVHSU46VkNBTEVOREFSDQpQUk9ESUQ6LS8vdHp1cmwub3JnLy9OT05TR01MIE9sc29uIDIwMjNkLy9FTg0KVkVSU0lPTjoyLjANCkJFR0lOOlZUSU1FWk9ORQ0KVFpJRDpBZnJpY2EvQWJpZGphbg0KTEFTVC1NT0RJRklFRDoyMDIzMTIyMlQyMzMzNThaDQpUWlVSTDpodHRwczovL3d3dy50enVybC5vcmcvem9uZWluZm8vQWZyaWNhL0FiaWRqYW4NClgtTElDLUxPQ0FUSU9OOkFmcmljYS9BYmlkamFuDQpYLVBST0xFUFRJQy1UWk5BTUU6TE1UDQpCRUdJTjpTVEFOREFSRA0KVFpOQU1FOkdNVA0KVFpPRkZTRVRGUk9NOi0wMDE2MDgNClRaT0ZGU0VUVE86KzAwMDANCkRUU1RBUlQ6MTkxMjAxMDFUMDAwMDAwDQpFTkQ6U1RBTkRBUkQNCkVORDpWVElNRVpPTkUNCkVORDpWQ0FMRU5EQVINCg=="
}
],
"body": {
"content": "<html><head>\r\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\"><style type=\"text/css\" style=\"display:none;\"> P {margin-top:0;margin-bottom:0;} </style></head><body dir=\"ltr\"><div class=\"elementToProof\" style=\"font-family: Aptos, Aptos_EmbeddedFont, Aptos_MSFontService, Calibri, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);\">I just realized the purpose of my life is to be a test case. Good to know.<br></div></body></html>",
"contentType": "html"
},
"bodyPreview": "I just realized the purpose of my life is to be a test case. Good to know.",
"conversationId": "AAQkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNAAQAFEnxDqYmbJEm8d2l3qfS6A=",
"conversationIndex": "AQHaWBYiUSfEOpiZskSbx3aXep9LoA==",
"flag": {
"flagStatus": "notFlagged"
},
"from": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"hasAttachments": true,
"importance": "normal",
"internetMessageId": "<SJ0PR04MB7294108E381BCCE5C207B6DEBC472@SJ0PR04MB7294.namprd04.prod.outlook.com>",
"isDeliveryReceiptRequested": false,
"isDraft": false,
"isRead": true,
"isReadReceiptRequested": false,
"receivedDateTime": "2024-02-05T09:33:12Z",
"sender": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"sentDateTime": "2024-02-05T09:33:11Z",
"subject": "Purpose of life",
"toRecipients": [
{
"emailAddress": {
"address": "PradeepG@10rqc2.onmicrosoft.com",
"name": "Pradeep Gupta"
}
}
],
"webLink": "https://outlook.office365.com/owa/?AttachmentItemID=AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV%2FqAAABEgAQAEUyH0VS3HJBgHDlZdWZl03%3D&exvsurl=1&viewmodel=ItemAttachment"
}
}
],
"bccRecipients": [],
"body": {
"content": "<html><head>\r\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\"><style type=\"text/css\" style=\"display:none\">\r\n<!--\r\np\r\n\t{margin-top:0;\r\n\tmargin-bottom:0}\r\n-->\r\n</style></head><body dir=\"ltr\"><div><span class=\"elementToProof\" style=\"font-family:Aptos,Aptos_EmbeddedFont,Aptos_MSFontService,Calibri,Helvetica,sans-serif; font-size:12pt; color:rgb(0,0,0)\">Now, this is what we call nesting in this business.<br></span></div></body></html>",
"contentType": "html"
},
"bodyPreview": "Now, this is what we call nesting in this business.",
"ccRecipients": [],
"conversationId": "AAQkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNAAQAIv2-4RHwDhJhlqBV5PTE3Y=",
"conversationIndex": "AQHaWBZdi/b/hEfAOEmGWoFXk9MTdg==",
"flag": {
"flagStatus": "notFlagged"
},
"from": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"hasAttachments": true,
"importance": "normal",
"inferenceClassification": "focused",
"internetMessageId": "<SJ0PR04MB729409CE8C191E01151C110DBC472@SJ0PR04MB7294.namprd04.prod.outlook.com>",
"isDeliveryReceiptRequested": false,
"isDraft": false,
"isRead": true,
"isReadReceiptRequested": false,
"parentFolderId": "AQMkAGJiAGZhNjRlOC00OGI5LTQyNTItYjFkMy00NTJjMTgyZGZkMjQALgAAA0V2IruiJ9ZFvgAO6qBJFycBAEEUODQkmTtNjV_awmuHu00AAAIBCQAAAA==",
"receivedDateTime": "2024-02-05T09:33:46Z",
"replyTo": [],
"sender": {
"emailAddress": {
"address": "JohannaL@10rqc2.onmicrosoft.com",
"name": "Johanna Lorenz"
}
},
"sentDateTime": "2024-02-05T09:33:45Z",
"subject": "Fw: Purpose of life",
"toRecipients": [
{
"emailAddress": {
"address": "PradeepG@10rqc2.onmicrosoft.com",
"name": "Pradeep Gupta"
}
}
],
"webLink": "https://outlook.office365.com/owa/?ItemID=AAMkAGJiZmE2NGU4LTQ4YjktNDI1Mi1iMWQzLTQ1MmMxODJkZmQyNABGAAAAAABFdiK7oifWRb4ADuqgSRcnBwBBFDg0JJk7TY1fmsJrh7tNAAAAAAEJAABBFDg0JJk7TY1fmsJrh7tNAAFnbV%2FqAAA%3D&exvsurl=1&viewmodel=ReadMessageItem"
}

View File

@ -10,3 +10,6 @@ var EmailWithEventInfo string
//go:embed email-with-event-object.json
var EmailWithEventObject string
//go:embed email-within-email.json
var EmailWithinEmail string

View File

@ -166,3 +166,20 @@ var GraphTimeZoneToTZ = map[string]string{
"Yukon Standard Time": "America/Whitehorse", "Yukon Standard Time": "America/Whitehorse",
"tzone://Microsoft/Utc": "Etc/UTC", "tzone://Microsoft/Utc": "Etc/UTC",
} }
// Map from alternatives to the canonical time zone name
// There mapping are currently generated by manually going on the
// values in the GraphTimeZoneToTZ which is not available in the tzdb
var CanonicalTimeZoneMap = map[string]string{
"Africa/Asmara": "Africa/Asmera",
"Asia/Calcutta": "Asia/Kolkata",
"Asia/Rangoon": "Asia/Yangon",
"Asia/Saigon": "Asia/Ho_Chi_Minh",
"Europe/Kiev": "Europe/Kyiv",
"Europe/Warsaw": "Europe/Warszawa",
"America/Buenos_Aires": "America/Argentina/Buenos_Aires",
"America/Godthab": "America/Nuuk",
// NOTE: "Atlantic/Raykjavik" missing in tzdb but is in MS list
"Etc/UTC": "UTC", // simplifying the time zone name
}
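Taken together with GraphTimeZoneToTZ above, this map supports a two-step normalization, the same lookup chain getRecurrenceTimezone performs further down in this diff. A small sketch; the helper name normalizeTimeZone is illustrative:
// normalizeTimeZone is an illustrative helper showing the lookup chain:
// Windows/Graph name -> IANA name -> canonical tzdb name.
func normalizeTimeZone(name string) string {
	if tz, ok := GraphTimeZoneToTZ[name]; ok {
		name = tz
	}
	if canonical, ok := CanonicalTimeZoneMap[name]; ok {
		name = canonical
	}
	// e.g. "Asia/Calcutta" normalizes to "Asia/Kolkata" via CanonicalTimeZoneMap.
	return name
}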

View File

@ -5,6 +5,7 @@ import (
"encoding/base64" "encoding/base64"
"encoding/json" "encoding/json"
"fmt" "fmt"
"net/mail"
"strings" "strings"
"time" "time"
"unicode" "unicode"
@ -16,6 +17,7 @@ import (
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/common/str" "github.com/alcionai/corso/src/internal/common/str"
"github.com/alcionai/corso/src/internal/converters/ics/tzdata"
"github.com/alcionai/corso/src/pkg/dttm" "github.com/alcionai/corso/src/pkg/dttm"
"github.com/alcionai/corso/src/pkg/logger" "github.com/alcionai/corso/src/pkg/logger"
"github.com/alcionai/corso/src/pkg/services/m365/api" "github.com/alcionai/corso/src/pkg/services/m365/api"
@ -31,8 +33,9 @@ import (
// TODO locations: https://github.com/alcionai/corso/issues/5003
const (
ICalDateTimeFormat = "20060102T150405"
ICalDateTimeFormatUTC = "20060102T150405Z"
ICalDateFormat = "20060102"
)
func keyValues(key, value string) *ics.KeyValues {
@ -172,6 +175,17 @@ func getRecurrencePattern(
recurComponents = append(recurComponents, "BYDAY="+prefix+strings.Join(dowComponents, ","))
}
// This is necessary to compute when weekly events recur
fdow := pat.GetFirstDayOfWeek()
if fdow != nil {
icalday, ok := GraphToICalDOW[fdow.String()]
if !ok {
return "", clues.NewWC(ctx, "unknown first day of week").With("day", fdow)
}
recurComponents = append(recurComponents, "WKST="+icalday)
}
rrange := recurrence.GetRangeEscaped()
if rrange != nil {
switch ptr.Val(rrange.GetTypeEscaped()) {
@ -195,7 +209,7 @@ func getRecurrencePattern(
return "", clues.WrapWC(ctx, err, "parsing end time")
}
recurComponents = append(recurComponents, "UNTIL="+endTime.Format(ICalDateTimeFormatUTC))
}
case models.NOEND_RECURRENCERANGETYPE:
// Nothing to do
@ -224,10 +238,15 @@ func FromEventable(ctx context.Context, event models.Eventable) (string, error)
cal := ics.NewCalendar()
cal.SetProductId("-//Alcion//Corso") // Does this have to be customizable?
err := addTimeZoneComponents(ctx, cal, event)
if err != nil {
return "", clues.Wrap(err, "adding timezone components")
}
id := ptr.Val(event.GetId())
iCalEvent := cal.AddEvent(id)
err = updateEventProperties(ctx, event, iCalEvent)
if err != nil {
return "", clues.Wrap(err, "updating event properties")
}
@ -258,7 +277,7 @@ func FromEventable(ctx context.Context, event models.Eventable) (string, error)
exICalEvent := cal.AddEvent(id)
start := exception.GetOriginalStart() // will always be in UTC
exICalEvent.AddProperty(ics.ComponentProperty(ics.PropertyRecurrenceId), start.Format(ICalDateTimeFormatUTC))
err = updateEventProperties(ctx, exception, exICalEvent)
if err != nil {
@ -269,6 +288,91 @@ func FromEventable(ctx context.Context, event models.Eventable) (string, error)
return cal.Serialize(), nil
}
func getTZDataKeyValues(ctx context.Context, timezone string) (map[string]string, error) {
template, ok := tzdata.TZData[timezone]
if !ok {
return nil, clues.NewWC(ctx, "timezone not found in tz database").
With("timezone", timezone)
}
keyValues := map[string]string{}
for _, line := range strings.Split(template, "\n") {
splits := strings.SplitN(line, ":", 2)
if len(splits) != 2 {
return nil, clues.NewWC(ctx, "invalid tzdata line").
With("line", line).
With("timezone", timezone)
}
keyValues[splits[0]] = splits[1]
}
return keyValues, nil
}
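A hedged sketch of the parse getTZDataKeyValues performs; the template literal below is made up for illustration, while the real templates come from the embedded tzdata package:
// exampleParse mirrors the SplitN-based parse above on an assumed
// two-line template of KEY:VALUE pairs; real input is tzdata.TZData.
func exampleParse() map[string]string {
	template := "TZID:Asia/Kolkata\nX-LIC-LOCATION:Asia/Kolkata"
	kvs := map[string]string{}
	for _, line := range strings.Split(template, "\n") {
		parts := strings.SplitN(line, ":", 2)
		kvs[parts[0]] = parts[1]
	}
	// kvs now maps "TZID" and "X-LIC-LOCATION" to "Asia/Kolkata".
	return kvs
}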
func addTimeZoneComponents(ctx context.Context, cal *ics.Calendar, event models.Eventable) error {
// Handling of timezones gets a bit tricky when we have to deal with
// relative recurrence. The issue comes up when we set a recurrence
// to be something like "repeat every 3rd Tuesday". Tuesday in UTC
// and in IST will be different, and so we cannot just always use UTC.
//
// The way this is solved is by using the timezone in the
// recurrence for the start and end timezones, as we have to use UTC
// for UNTIL (mostly).
// https://www.rfc-editor.org/rfc/rfc5545#section-3.3.10
timezone, err := getRecurrenceTimezone(ctx, event)
if err != nil {
return clues.Stack(err)
}
if timezone != time.UTC {
kvs, err := getTZDataKeyValues(ctx, timezone.String())
if err != nil {
return clues.Stack(err)
}
tz := cal.AddTimezone(timezone.String())
for k, v := range kvs {
tz.AddProperty(ics.ComponentProperty(k), v)
}
}
return nil
}
// getRecurrenceTimezone gets the timezone specified by the recurrence
// in the calendar. It does a normalization pass where we always convert
// the timezone to the value in the tzdb. If we don't have a recurrence
// timezone, we don't have to use a specific timezone in the export and
// it is safe to return UTC from this method.
func getRecurrenceTimezone(ctx context.Context, event models.Eventable) (*time.Location, error) {
if event.GetRecurrence() != nil {
timezone := ptr.Val(event.GetRecurrence().GetRangeEscaped().GetRecurrenceTimeZone())
ctz, ok := GraphTimeZoneToTZ[timezone]
if ok {
timezone = ctz
}
cannon, ok := CanonicalTimeZoneMap[timezone]
if ok {
timezone = cannon
}
loc, err := time.LoadLocation(timezone)
if err != nil {
return nil, clues.WrapWC(ctx, err, "unknown timezone").
With("timezone", timezone)
}
return loc, nil
}
return time.UTC, nil
}
func isASCII(s string) bool {
for _, c := range s {
if c > unicode.MaxASCII {
@ -279,6 +383,12 @@ func isASCII(s string) bool {
return true
}
// Checks if a given string is a valid email address
func isEmail(em string) bool {
_, err := mail.ParseAddress(em)
return err == nil
}
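A quick behavior sketch for isEmail, anticipating its use for attendee filtering further down in this diff; the values are illustrative:
// exampleIsEmail is illustrative: RFC 5322 addresses parse, while
// Exchange-internal legacyDN-style strings are rejected by
// mail.ParseAddress and so return false.
func exampleIsEmail() {
	fmt.Println(isEmail("user@example.com"))                   // true
	fmt.Println(isEmail("/o=ExchangeLabs/cn=Recipients/cn=x")) // false
}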
func updateEventProperties(ctx context.Context, event models.Eventable, iCalEvent *ics.VEvent) error {
// CREATED - https://www.rfc-editor.org/rfc/rfc5545#section-3.8.7.1
created := event.GetCreatedDateTime()
@ -292,6 +402,11 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
iCalEvent.SetModifiedAt(ptr.Val(modified))
}
timezone, err := getRecurrenceTimezone(ctx, event)
if err != nil {
return err
}
// DTSTART - https://www.rfc-editor.org/rfc/rfc5545#section-3.8.2.4
allDay := ptr.Val(event.GetIsAllDay())
startString := event.GetStart().GetDateTime()
@ -303,11 +418,7 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
return clues.WrapWC(ctx, err, "parsing start time") return clues.WrapWC(ctx, err, "parsing start time")
} }
if allDay { addTime(iCalEvent, ics.ComponentPropertyDtStart, start, allDay, timezone)
iCalEvent.SetStartAt(start, ics.WithValue(string(ics.ValueDataTypeDate)))
} else {
iCalEvent.SetStartAt(start)
}
} }
// DTEND - https://www.rfc-editor.org/rfc/rfc5545#section-3.8.2.2
@ -320,11 +431,7 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
return clues.WrapWC(ctx, err, "parsing end time")
}
if allDay {
iCalEvent.SetEndAt(end, ics.WithValue(string(ics.ValueDataTypeDate)))
} else {
iCalEvent.SetEndAt(end)
}
addTime(iCalEvent, ics.ComponentPropertyDtEnd, end, allDay, timezone)
}
recurrence := event.GetRecurrence()
@ -339,7 +446,7 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
// STATUS - https://www.rfc-editor.org/rfc/rfc5545#section-3.8.1.11
cancelled := event.GetIsCancelled()
if cancelled != nil && ptr.Val(cancelled) {
iCalEvent.SetStatus(ics.ObjectStatusCancelled)
}
@ -377,7 +484,14 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
desc := replacer.Replace(description)
iCalEvent.AddProperty("X-ALT-DESC", desc, ics.WithFmtType("text/html"))
} else {
// Disable auto wrap, causes huge memory spikes
// https://github.com/jaytaylor/html2text/issues/48
prettyTablesOptions := html2text.NewPrettyTablesOptions()
prettyTablesOptions.AutoWrapText = false
stripped, err := html2text.FromString(
description,
html2text.Options{PrettyTables: true, PrettyTablesOptions: prettyTablesOptions})
if err != nil {
return clues.Wrap(err, "converting html to text").
With("description_length", len(description))
@ -481,8 +595,21 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
}
}
// It is possible that we get non-email items like the one below,
// which is an internal representation of the user in the
// Exchange system. While we can technically output this as an
// attendee, it is not useful, plus downstream tools, like the
// ones that produce PSTs, can choke on it.
// /o=ExchangeLabs/ou=ExchangeAdministrative Group(FY...LT)/cn=Recipients/cn=883...4a-John Doe
addr := ptr.Val(attendee.GetEmailAddress().GetAddress())
if isEmail(addr) {
iCalEvent.AddAttendee(addr, props...)
} else {
logger.Ctx(ctx).
With("attendee_email", addr).
With("attendee_name", name).
Info("skipping non email attendee from ics export")
}
}
// LOCATION - https://www.rfc-editor.org/rfc/rfc5545#section-3.8.1.7
@ -610,6 +737,26 @@ func updateEventProperties(ctx context.Context, event models.Eventable, iCalEven
return nil
}
func addTime(iCalEvent *ics.VEvent, prop ics.ComponentProperty, tm time.Time, allDay bool, tzLoc *time.Location) {
if allDay {
if tzLoc == time.UTC {
iCalEvent.SetProperty(prop, tm.Format(ICalDateFormat), ics.WithValue(string(ics.ValueDataTypeDate)))
} else {
iCalEvent.SetProperty(
prop,
tm.In(tzLoc).Format(ICalDateFormat),
ics.WithValue(string(ics.ValueDataTypeDate)),
keyValues("TZID", tzLoc.String()))
}
} else {
if tzLoc == time.UTC {
iCalEvent.SetProperty(prop, tm.Format(ICalDateTimeFormatUTC))
} else {
iCalEvent.SetProperty(prop, tm.In(tzLoc).Format(ICalDateTimeFormat), keyValues("TZID", tzLoc.String()))
}
}
}
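A worked example of the four output shapes addTime can produce, assuming a 2021-01-01 13:00 UTC instant; the exact parameter ordering in the serialized line depends on the golang-ical library:
// Illustrative outputs for addTime on prop = DTSTART; values assumed,
// not taken from a test in this PR:
//
//   allDay=false, tzLoc=UTC:
//     DTSTART:20210101T130000Z
//   allDay=false, tzLoc=Asia/Kolkata (UTC+05:30):
//     DTSTART;TZID=Asia/Kolkata:20210101T183000
//   allDay=true, tzLoc=UTC:
//     DTSTART;VALUE=DATE:20210101
//   allDay=true, tzLoc=Asia/Kolkata:
//     DTSTART;VALUE=DATE;TZID=Asia/Kolkata:20210101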
func getCancelledDates(ctx context.Context, event models.Eventable) ([]time.Time, error) {
dateStrings, err := api.GetCancelledEventDateStrings(event)
if err != nil {

View File

@ -13,6 +13,7 @@ import (
"testing" "testing"
"time" "time"
ics "github.com/arran4/golang-ical"
"github.com/microsoft/kiota-abstractions-go/serialization" "github.com/microsoft/kiota-abstractions-go/serialization"
kjson "github.com/microsoft/kiota-serialization-json-go" kjson "github.com/microsoft/kiota-serialization-json-go"
"github.com/microsoftgraph/msgraph-sdk-go/models" "github.com/microsoftgraph/msgraph-sdk-go/models"
@ -21,6 +22,7 @@ import (
"github.com/stretchr/testify/suite" "github.com/stretchr/testify/suite"
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/converters/ics/tzdata"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
) )
@ -32,7 +34,7 @@ func TestICSUnitSuite(t *testing.T) {
suite.Run(t, &ICSUnitSuite{Suite: tester.NewUnitSuite(t)}) suite.Run(t, &ICSUnitSuite{Suite: tester.NewUnitSuite(t)})
} }
func (suite *ICSUnitSuite) TestGetLocationString() { func (s *ICSUnitSuite) TestGetLocationString() {
table := []struct { table := []struct {
name string name string
loc func() models.Locationable loc func() models.Locationable
@ -110,13 +112,13 @@ func (suite *ICSUnitSuite) TestGetLocationString() {
}
for _, tt := range table {
s.Run(tt.name, func() {
assert.Equal(s.T(), tt.expect, getLocationString(tt.loc()))
})
}
}
func (s *ICSUnitSuite) TestGetUTCTime() {
table := []struct {
name string
timestamp string
@ -162,18 +164,18 @@ func (suite *ICSUnitSuite) TestGetUTCTime() {
}
for _, tt := range table {
s.Run(tt.name, func() {
t, err := GetUTCTime(tt.timestamp, tt.timezone)
tt.errCheck(s.T(), err)
if !tt.time.Equal(time.Time{}) {
assert.Equal(s.T(), tt.time, t)
}
})
}
}
func (s *ICSUnitSuite) TestGetRecurrencePattern() {
table := []struct {
name string
recurrence func() models.PatternedRecurrenceable
@ -187,16 +189,37 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("daily")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rec.SetPattern(pat)
return rec
},
expect: "FREQ=DAILY;INTERVAL=1;WKST=SU",
errCheck: require.NoError,
},
{
name: "daily different start of week",
recurrence: func() models.PatternedRecurrenceable {
rec := models.NewPatternedRecurrence()
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("daily")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.MONDAY_DAYOFWEEK))
rec.SetPattern(pat)
return rec
},
expect: "FREQ=DAILY;INTERVAL=1;WKST=MO",
errCheck: require.NoError, errCheck: require.NoError,
}, },
{
@ -206,15 +229,16 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("daily")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rng := models.NewRecurrenceRange()
rrtype, err := models.ParseRecurrenceRangeType("endDate")
require.NoError(s.T(), err)
rng.SetTypeEscaped(rrtype.(*models.RecurrenceRangeType))
@ -227,7 +251,7 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
return rec
},
expect: "FREQ=DAILY;INTERVAL=1;WKST=SU;UNTIL=20210101T182959Z",
errCheck: require.NoError,
},
{
@ -237,16 +261,17 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("weekly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rec.SetPattern(pat)
return rec
},
expect: "FREQ=WEEKLY;INTERVAL=1;WKST=SU",
errCheck: require.NoError,
},
{
@ -256,15 +281,16 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("weekly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rng := models.NewRecurrenceRange()
rrtype, err := models.ParseRecurrenceRangeType("endDate")
require.NoError(s.T(), err)
rng.SetTypeEscaped(rrtype.(*models.RecurrenceRangeType))
@ -277,7 +303,7 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
return rec
},
expect: "FREQ=WEEKLY;INTERVAL=1;WKST=SU;UNTIL=20210101T235959Z",
errCheck: require.NoError,
},
{
@ -287,15 +313,16 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("weekly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rng := models.NewRecurrenceRange()
rrtype, err := models.ParseRecurrenceRangeType("numbered")
require.NoError(s.T(), err)
rng.SetTypeEscaped(rrtype.(*models.RecurrenceRangeType))
@ -307,7 +334,7 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
return rec
},
expect: "FREQ=WEEKLY;INTERVAL=1;WKST=SU;COUNT=10",
errCheck: require.NoError,
},
{
@ -317,10 +344,11 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("weekly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
days := []models.DayOfWeek{
models.MONDAY_DAYOFWEEK,
@ -334,7 +362,7 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
return rec
},
expect: "FREQ=WEEKLY;INTERVAL=1;BYDAY=MO,WE,TH;WKST=SU",
errCheck: require.NoError,
},
{
@ -344,16 +372,17 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("daily")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(2)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rec.SetPattern(pat)
return rec
},
expect: "FREQ=DAILY;INTERVAL=2;WKST=SU",
errCheck: require.NoError,
},
{
@ -363,10 +392,11 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("absoluteMonthly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
pat.SetDayOfMonth(ptr.To(int32(5)))
@ -374,7 +404,7 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
return rec
},
expect: "FREQ=MONTHLY;INTERVAL=1;BYMONTHDAY=5;WKST=SU",
errCheck: require.NoError,
},
{
@ -384,10 +414,11 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("absoluteYearly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(3)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
pat.SetMonth(ptr.To(int32(8)))
@ -395,7 +426,7 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
return rec
},
expect: "FREQ=YEARLY;INTERVAL=3;BYMONTH=8;WKST=SU",
errCheck: require.NoError,
},
{
@ -405,37 +436,38 @@ func (suite *ICSUnitSuite) TestGetRecurrencePattern() {
pat := models.NewRecurrencePattern()
typ, err := models.ParseRecurrencePatternType("relativeYearly")
require.NoError(s.T(), err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
pat.SetMonth(ptr.To(int32(8)))
pat.SetDaysOfWeek([]models.DayOfWeek{models.FRIDAY_DAYOFWEEK})
wi, err := models.ParseWeekIndex("first")
require.NoError(s.T(), err)
pat.SetIndex(wi.(*models.WeekIndex))
rec.SetPattern(pat)
return rec
},
expect: "FREQ=YEARLY;INTERVAL=1;BYMONTH=8;BYDAY=1FR;WKST=SU",
errCheck: require.NoError,
},
// TODO(meain): could still use more tests for edge cases of time
}
for _, tt := range table {
s.Run(tt.name, func() {
ctx, flush := tester.NewContext(s.T())
defer flush()
rec, err := getRecurrencePattern(ctx, tt.recurrence())
tt.errCheck(s.T(), err)
assert.Equal(s.T(), tt.expect, rec)
})
}
}
@ -460,8 +492,8 @@ func baseEvent() *models.Event {
return e
}
func (s *ICSUnitSuite) TestEventConversion() {
t := s.T()
table := []struct {
name string
@ -546,14 +578,19 @@ func (suite *ICSUnitSuite) TestEventConversion() {
rec := models.NewPatternedRecurrence()
pat := models.NewRecurrencePattern()
rng := models.NewRecurrenceRange()
typ, err := models.ParseRecurrencePatternType("daily") typ, err := models.ParseRecurrencePatternType("daily")
require.NoError(t, err) require.NoError(t, err)
pat.SetTypeEscaped(typ.(*models.RecurrencePatternType)) pat.SetTypeEscaped(typ.(*models.RecurrencePatternType))
pat.SetInterval(ptr.To(int32(1))) pat.SetInterval(ptr.To(int32(1)))
pat.SetFirstDayOfWeek(ptr.To(models.SUNDAY_DAYOFWEEK))
rng.SetRecurrenceTimeZone(ptr.To("UTC"))
rec.SetPattern(pat)
rec.SetRangeEscaped(rng)
e.SetRecurrence(rec)
@ -576,6 +613,19 @@ func (suite *ICSUnitSuite) TestEventConversion() {
assert.Contains(t, out, "STATUS:CANCELLED", "cancelled status") assert.Contains(t, out, "STATUS:CANCELLED", "cancelled status")
}, },
}, },
{
name: "not cancelled event",
event: func() *models.Event {
e := baseEvent()
e.SetIsCancelled(ptr.To(false))
return e
},
check: func(out string) {
assert.NotContains(t, out, "STATUS:CANCELLED", "cancelled status")
},
},
{
name: "text body",
event: func() *models.Event {
@ -817,8 +867,8 @@ func (suite *ICSUnitSuite) TestEventConversion() {
}
for _, tt := range table {
s.Run(tt.name, func() {
t := s.T()
ctx, flush := tester.NewContext(t)
defer flush()
@ -868,8 +918,8 @@ func checkAttendee(t *testing.T, out, check, msg string) {
assert.ElementsMatch(t, as, bs, fmt.Sprintf("fields %s", msg))
}
func (s *ICSUnitSuite) TestAttendees() {
t := s.T()
table := []struct {
name string
@ -895,6 +945,17 @@ func (suite *ICSUnitSuite) TestAttendees() {
"attendee") "attendee")
}, },
}, },
{
name: "attendee with internal exchange representation for email",
att: [][]string{{
"/o=ExchangeLabs/ou=ExchangeAdministrative Group(FY...LT)/cn=Recipients/cn=883...4a-John Doe",
"required",
"declined",
}},
check: func(out string) {
assert.NotContains(t, out, "ATTENDEE")
},
},
{
name: "multiple attendees",
att: [][]string{
@ -925,8 +986,8 @@ func (suite *ICSUnitSuite) TestAttendees() {
}
for _, tt := range table {
s.Run(tt.name, func() {
t := s.T()
ctx, flush := tester.NewContext(t)
defer flush()
@ -1047,8 +1108,8 @@ func checkAttachment(t *testing.T, out, check, msg string) {
assert.ElementsMatch(t, as, bs, fmt.Sprintf("fields %s", msg))
}
func (s *ICSUnitSuite) TestAttachments() {
t := s.T()
type attachment struct {
cid string // contentid
@ -1104,8 +1165,8 @@ func (suite *ICSUnitSuite) TestAttachments() {
}
for _, tt := range table {
s.Run(tt.name, func() {
t := s.T()
ctx, flush := tester.NewContext(t)
defer flush()
@ -1148,7 +1209,7 @@ func (suite *ICSUnitSuite) TestAttachments() {
}
}
func (s *ICSUnitSuite) TestCancellations() {
table := []struct {
name string
cancelledIds []string
@ -1172,8 +1233,8 @@ func (suite *ICSUnitSuite) TestCancellations() {
}
for _, tt := range table {
s.Run(tt.name, func() {
t := s.T()
ctx, flush := tester.NewContext(t)
defer flush()
@ -1236,7 +1297,7 @@ func eventToJSON(e *models.Event) ([]byte, error) {
return bts, err
}
func (s *ICSUnitSuite) TestEventExceptions() {
table := []struct {
name string
event func() *models.Event
@ -1258,7 +1319,7 @@ func (suite *ICSUnitSuite) TestEventExceptions() {
exception.SetEnd(newEnd)
parsed, err := eventToMap(exception)
require.NoError(s.T(), err, "parsing exception")
// add exception event to additional data
e.SetAdditionalData(map[string]any{
@ -1277,15 +1338,15 @@ func (suite *ICSUnitSuite) TestEventExceptions() {
}
}
assert.Equal(s.T(), 2, events, "number of events")
assert.Contains(s.T(), out, "RECURRENCE-ID:20210101T120000Z", "recurrence id")
assert.Contains(s.T(), out, "SUMMARY:Subject", "original event")
assert.Contains(s.T(), out, "SUMMARY:Exception", "exception event")
assert.Contains(s.T(), out, "DTSTART:20210101T130000Z", "new start time")
assert.Contains(s.T(), out, "DTEND:20210101T140000Z", "new end time")
},
},
{
@ -1314,10 +1375,10 @@ func (suite *ICSUnitSuite) TestEventExceptions() {
exception2.SetEnd(newEnd)
parsed1, err := eventToMap(exception1)
require.NoError(s.T(), err, "parsing exception 1")
parsed2, err := eventToMap(exception2)
require.NoError(s.T(), err, "parsing exception 2")
// add exception event to additional data
e.SetAdditionalData(map[string]any{
@ -1336,36 +1397,230 @@ func (suite *ICSUnitSuite) TestEventExceptions() {
}
}
assert.Equal(s.T(), 3, events, "number of events")
assert.Contains(s.T(), out, "RECURRENCE-ID:20210101T120000Z", "recurrence id 1")
assert.Contains(s.T(), out, "RECURRENCE-ID:20210102T120000Z", "recurrence id 2")
assert.Contains(s.T(), out, "SUMMARY:Subject", "original event")
assert.Contains(s.T(), out, "SUMMARY:Exception 1", "exception event 1")
assert.Contains(s.T(), out, "SUMMARY:Exception 2", "exception event 2")
assert.Contains(s.T(), out, "DTSTART:20210101T130000Z", "new start time 1")
assert.Contains(s.T(), out, "DTEND:20210101T140000Z", "new end time 1")
assert.Contains(s.T(), out, "DTSTART:20210102T130000Z", "new start time 2")
assert.Contains(s.T(), out, "DTEND:20210102T140000Z", "new end time 2")
},
},
}
for _, tt := range table { for _, tt := range table {
suite.Run(tt.name, func() { s.Run(tt.name, func() {
ctx, flush := tester.NewContext(suite.T()) ctx, flush := tester.NewContext(s.T())
defer flush() defer flush()
bts, err := eventToJSON(tt.event()) bts, err := eventToJSON(tt.event())
require.NoError(suite.T(), err, "getting serialized content") require.NoError(s.T(), err, "getting serialized content")
out, err := FromJSON(ctx, bts) out, err := FromJSON(ctx, bts)
require.NoError(suite.T(), err, "converting to ics") require.NoError(s.T(), err, "converting to ics")
tt.check(out) tt.check(out)
}) })
} }
} }
func (s *ICSUnitSuite) TestGetRecurrenceTimezone() {
table := []struct {
name string
intz string
outtz string
}{
{
name: "empty",
intz: "",
outtz: "UTC",
},
{
name: "utc",
intz: "UTC",
outtz: "UTC",
},
{
name: "simple",
intz: "Asia/Kolkata",
outtz: "Asia/Kolkata",
},
{
name: "windows tz",
intz: "India Standard Time",
outtz: "Asia/Kolkata",
},
{
name: "non canonical",
intz: "Asia/Calcutta",
outtz: "Asia/Kolkata",
},
}
for _, tt := range table {
s.Run(tt.name, func() {
ctx, flush := tester.NewContext(s.T())
defer flush()
event := baseEvent()
if len(tt.intz) > 0 {
recur := models.NewPatternedRecurrence()
rp := models.NewRecurrenceRange()
rp.SetRecurrenceTimeZone(ptr.To(tt.intz))
recur.SetRangeEscaped(rp)
event.SetRecurrence(recur)
}
timezone, err := getRecurrenceTimezone(ctx, event)
require.NoError(s.T(), err)
assert.Equal(s.T(), tt.outtz, timezone.String())
})
}
}
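The "windows tz" and "non canonical" cases above show that getRecurrenceTimezone normalizes both Windows display names and legacy IANA aliases before loading the location. A minimal sketch of that normalization, assuming hand-maintained alias tables (windowsToIANA and ianaAliases are hypothetical; the real mapping source may differ):

package ics

import "time"

// Hypothetical alias tables; illustrative entries only.
var windowsToIANA = map[string]string{
	"India Standard Time": "Asia/Kolkata",
}

var ianaAliases = map[string]string{
	"Asia/Calcutta": "Asia/Kolkata",
}

// normalizeTimezone sketches the behavior the table above asserts.
func normalizeTimezone(name string) (*time.Location, error) {
	if name == "" {
		return time.UTC, nil // empty recurrence timezone defaults to UTC
	}

	if iana, ok := windowsToIANA[name]; ok {
		name = iana // translate Windows display names
	}

	if canon, ok := ianaAliases[name]; ok {
		name = canon // canonicalize legacy IANA names
	}

	return time.LoadLocation(name)
}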
func (s *ICSUnitSuite) TestAddTimezoneComponents() {
event := baseEvent()
recur := models.NewPatternedRecurrence()
rp := models.NewRecurrenceRange()
rp.SetRecurrenceTimeZone(ptr.To("Asia/Kolkata"))
recur.SetRangeEscaped(rp)
event.SetRecurrence(recur)
ctx, flush := tester.NewContext(s.T())
defer flush()
cal := ics.NewCalendar()
err := addTimeZoneComponents(ctx, cal, event)
require.NoError(s.T(), err)
text := cal.Serialize()
assert.Contains(s.T(), text, "BEGIN:VTIMEZONE", "beginning of timezone")
assert.Contains(s.T(), text, "TZID:Asia/Kolkata", "timezone id")
assert.Contains(s.T(), text, "END:VTIMEZONE", "end of timezone")
}
func (s *ICSUnitSuite) TestAddTime() {
kolkata, err := time.LoadLocation("Asia/Kolkata")
require.NoError(s.T(), err)
table := []struct {
name string
prop ics.ComponentProperty
time time.Time
allDay bool
loc *time.Location
exp string
}{
{
name: "utc",
prop: ics.ComponentPropertyDtStart,
time: time.Date(2021, 1, 2, 3, 4, 5, 0, time.UTC),
allDay: false,
loc: time.UTC,
exp: "DTSTART:20210102T030405Z",
},
{
name: "local",
prop: ics.ComponentPropertyDtStart,
time: time.Date(2021, 1, 2, 3, 4, 5, 0, time.UTC),
allDay: false,
loc: kolkata,
exp: "DTSTART;TZID=Asia/Kolkata:20210102T083405",
},
{
name: "all day",
prop: ics.ComponentPropertyDtStart,
time: time.Date(2021, 1, 2, 0, 0, 0, 0, time.UTC),
allDay: true,
loc: time.UTC,
exp: "DTSTART;VALUE=DATE:20210102",
},
{
name: "all day local",
prop: ics.ComponentPropertyDtStart,
time: time.Date(2021, 1, 2, 0, 0, 0, 0, time.UTC),
allDay: true,
loc: kolkata,
exp: "DTSTART;VALUE=DATE;TZID=Asia/Kolkata:20210102",
},
{
name: "end",
prop: ics.ComponentPropertyDtEnd,
time: time.Date(2021, 1, 2, 3, 4, 5, 0, time.UTC),
allDay: false,
loc: time.UTC,
exp: "DTEND:20210102T030405Z",
},
{
// This shouldn't happen in practice, but it's a useful case for exercising loc handling
name: "windows tz",
prop: ics.ComponentPropertyDtStart,
time: time.Date(2021, 1, 2, 3, 4, 5, 0, time.UTC),
allDay: false,
loc: time.FixedZone("India Standard Time", 5*60*60+30*60),
exp: "DTSTART;TZID=India Standard Time:20210102T083405",
},
}
for _, tt := range table {
s.Run(tt.name, func() {
cal := ics.NewCalendar()
evt := cal.AddEvent("id")
addTime(evt, tt.prop, tt.time, tt.allDay, tt.loc)
expSplits := strings.FieldsFunc(tt.exp, func(c rune) bool {
return c == ':' || c == ';'
})
text := cal.Serialize()
checkLine := ""
for _, l := range strings.Split(text, "\r\n") {
if strings.HasPrefix(l, string(tt.prop)) {
checkLine = l
break
}
}
actSplits := strings.FieldsFunc(checkLine, func(c rune) bool {
return c == ':' || c == ';'
})
assert.Greater(s.T(), len(checkLine), 0, "line not found")
assert.Equal(s.T(), len(expSplits), len(actSplits), "length of fields")
assert.ElementsMatch(s.T(), expSplits, actSplits, "fields")
})
}
}
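The expectations in this table pin down the serialization rules: UTC times end in Z, zoned times carry a TZID parameter and are rendered in local time, and all-day values use VALUE=DATE with no time portion. A self-contained sketch of those rules (formatICSTime is a hypothetical helper; the real addTime writes properties through the ics calendar):

package main

import (
	"fmt"
	"time"
)

// formatICSTime mirrors the formatting rules the table above asserts.
func formatICSTime(prop string, t time.Time, allDay bool, loc *time.Location) string {
	local := t.In(loc)

	switch {
	case allDay && loc == time.UTC:
		return fmt.Sprintf("%s;VALUE=DATE:%s", prop, local.Format("20060102"))
	case allDay:
		return fmt.Sprintf("%s;VALUE=DATE;TZID=%s:%s", prop, loc.String(), local.Format("20060102"))
	case loc == time.UTC:
		return fmt.Sprintf("%s:%s", prop, local.Format("20060102T150405Z"))
	default:
		return fmt.Sprintf("%s;TZID=%s:%s", prop, loc.String(), local.Format("20060102T150405"))
	}
}

func main() {
	kolkata, _ := time.LoadLocation("Asia/Kolkata")
	start := time.Date(2021, 1, 2, 3, 4, 5, 0, time.UTC)

	// Prints: DTSTART;TZID=Asia/Kolkata:20210102T083405
	fmt.Println(formatICSTime("DTSTART", start, false, kolkata))
}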
// This test ensures that the generated data is in the format
// that we expect.
func (s *ICSUnitSuite) TestGetTZDataKeyValues() {
for key := range tzdata.TZData {
s.Run(key, func() {
ctx, flush := tester.NewContext(s.T())
defer flush()
data, err := getTZDataKeyValues(ctx, key)
require.NoError(s.T(), err)
assert.NotEmpty(s.T(), data, "data")
assert.NotContains(s.T(), data, "BEGIN", "beginning of timezone") // should be stripped
assert.NotContains(s.T(), data, "END", "end of timezone") // should be stripped
assert.NotContains(s.T(), data, "TZID", "timezone id") // should be stripped
assert.Contains(s.T(), data, "DTSTART", "start time")
assert.Contains(s.T(), data, "TZOFFSETFROM", "offset from")
})
}
}
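The function name and the assertions above imply that each stored entry is consumed as bare key:value property lines once the wrapper lines are stripped. A hedged sketch of such parsing (parseTZProperties is a hypothetical stand-in for the real getTZDataKeyValues plumbing):

package tzdata

import "strings"

// parseTZProperties splits a stripped tzdata entry into property key/value
// pairs. Illustrative only; the real code may parse into richer types.
func parseTZProperties(entry string) map[string][]string {
	props := map[string][]string{}

	for _, line := range strings.Split(entry, "\n") {
		line = strings.TrimRight(line, "\r")
		if line == "" {
			continue
		}

		key, value, ok := strings.Cut(line, ":")
		if !ok {
			continue // skip malformed lines
		}

		props[key] = append(props[key], value)
	}

	return props
}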

File diff suppressed because it is too large


@ -0,0 +1,35 @@
#!/bin/bash
# pipefail is not POSIX sh, so this script requires bash.
set -eo pipefail
if ! echo "$PWD" | grep -q '/tzdata$'; then
echo "Please run this script from the tzdata dir"
exit 1
fi
# TODO: Generate from https://www.iana.org/time-zones
if [ ! -d /tmp/corso-tzdata ]; then
git clone --depth 1 https://github.com/add2cal/timezones-ical-library.git /tmp/corso-tzdata
else
cd /tmp/corso-tzdata
git pull
cd -
fi
# Generate a huge go file with all the timezones
echo "package tzdata" >data.go
echo "" >>data.go
echo "var TZData = map[string]string{" >>data.go
find /tmp/corso-tzdata/ -name '*.ics' | while read -r f; do
tz=$(echo "$f" | sed 's|/tmp/corso-tzdata/api/||;s|\.ics$||')
echo "Processing $tz"
printf "\t\"%s\": \`" "$tz" >>data.go
cat "$f" | grep -Ev "(BEGIN:|END:|TZID:)" |
sed 's|`|\\`|g;s|\r||;s|TZID:/timezones-ical-library/|TZID:|' |
perl -pe 'chomp if eof' >>data.go
echo "\`," >>data.go
done
echo "}" >>data.go


@ -86,7 +86,7 @@ func FromJSON(ctx context.Context, body []byte) (string, error) {
data, err := api.BytesToContactable(body) data, err := api.BytesToContactable(body)
if err != nil { if err != nil {
return "", clues.Wrap(err, "converting to contactable"). return "", clues.WrapWC(ctx, err, "converting to contactable").
With("body_length", len(body)) With("body_length", len(body))
} }


@ -4,6 +4,7 @@ import (
"context" "context"
"fmt" "fmt"
"path/filepath" "path/filepath"
"reflect"
"sync" "sync"
"time" "time"
@ -24,11 +25,14 @@ import (
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/kopia/retention" "github.com/alcionai/corso/src/internal/kopia/retention"
"github.com/alcionai/corso/src/pkg/control/repository" "github.com/alcionai/corso/src/pkg/control/repository"
"github.com/alcionai/corso/src/pkg/fault"
"github.com/alcionai/corso/src/pkg/logger" "github.com/alcionai/corso/src/pkg/logger"
"github.com/alcionai/corso/src/pkg/storage" "github.com/alcionai/corso/src/pkg/storage"
) )
const ( const (
corsoWrapperAlertNamespace = "corso-kopia-wrapper"
defaultKopiaConfigDir = "/tmp/" defaultKopiaConfigDir = "/tmp/"
kopiaConfigFileTemplate = "repository-%s.config" kopiaConfigFileTemplate = "repository-%s.config"
defaultCompressor = "zstd-better-compression" defaultCompressor = "zstd-better-compression"
@ -55,6 +59,15 @@ const (
minEpochDurationUpperBound = 7 * 24 * time.Hour minEpochDurationUpperBound = 7 * 24 * time.Hour
) )
// allValidCompressors is the set of compression algorithms either currently
// being used or that were previously used. Use this during the config verify
// command to avoid spurious errors. We can revisit whether we want to update
// the config in those old repos at a later time.
var allValidCompressors = map[compression.Name]struct{}{
compression.Name(defaultCompressor): {},
compression.Name("s2-default"): {},
}
var ( var (
ErrSettingDefaultConfig = clues.New("setting default repo config values") ErrSettingDefaultConfig = clues.New("setting default repo config values")
ErrorRepoAlreadyExists = clues.New("repo already exists") ErrorRepoAlreadyExists = clues.New("repo already exists")
@ -145,12 +158,16 @@ func (w *conn) Initialize(
RetentionPeriod: blobCfg.RetentionPeriod, RetentionPeriod: blobCfg.RetentionPeriod,
} }
var initErr error
if err = repo.Initialize(ctx, bst, &kopiaOpts, cfg.CorsoPassphrase); err != nil { if err = repo.Initialize(ctx, bst, &kopiaOpts, cfg.CorsoPassphrase); err != nil {
if errors.Is(err, repo.ErrAlreadyInitialized) { if !errors.Is(err, repo.ErrAlreadyInitialized) {
return clues.StackWC(ctx, ErrorRepoAlreadyExists, err) return clues.WrapWC(ctx, err, "initializing repo")
} }
return clues.WrapWC(ctx, err, "initializing repo") logger.Ctx(ctx).Info("repo already exists, verifying repo config")
initErr = clues.StackWC(ctx, ErrorRepoAlreadyExists, err)
} }
err = w.commonConnect( err = w.commonConnect(
@ -162,7 +179,10 @@ func (w *conn) Initialize(
cfg.CorsoPassphrase, cfg.CorsoPassphrase,
defaultCompressor) defaultCompressor)
if err != nil { if err != nil {
return err // If the repo already exists then give some indication of that to help the
// user debug. For example, they could have called init again on a repo that
// already exists but accidentally used a different passphrase.
return clues.Stack(err, initErr)
} }
if err := w.setDefaultConfigValues(ctx); err != nil { if err := w.setDefaultConfigValues(ctx); err != nil {
@ -736,3 +756,115 @@ func (w *conn) updatePersistentConfig(
"persisting updated config"). "persisting updated config").
OrNil() OrNil()
} }
func (w *conn) verifyDefaultPolicyConfigOptions(
ctx context.Context,
errs *fault.Bus,
) {
const alertName = "kopia-global-policy"
globalPol, err := w.getGlobalPolicyOrEmpty(ctx)
if err != nil {
errs.AddAlert(ctx, fault.NewAlert(
err.Error(),
corsoWrapperAlertNamespace,
"fetch-policy",
alertName,
nil))
return
}
ctx = clues.Add(ctx, "current_global_policy", globalPol.String())
if _, ok := allValidCompressors[globalPol.CompressionPolicy.CompressorName]; !ok {
errs.AddAlert(ctx, fault.NewAlert(
"unexpected compressor",
corsoWrapperAlertNamespace,
"compressor",
alertName,
nil))
}
// Need to use deep equals because the values are pointers to optional types.
// That makes regular equality checks fail even if the data contained in each
// policy is the same.
if !reflect.DeepEqual(globalPol.RetentionPolicy, defaultRetention) {
errs.AddAlert(ctx, fault.NewAlert(
"unexpected retention policy",
corsoWrapperAlertNamespace,
"retention-policy",
alertName,
nil))
}
if globalPol.SchedulingPolicy.Interval() != defaultSchedulingInterval {
errs.AddAlert(ctx, fault.NewAlert(
"unexpected scheduling interval",
corsoWrapperAlertNamespace,
"scheduling-interval",
alertName,
nil))
}
}
func (w *conn) verifyRetentionConfig(
ctx context.Context,
errs *fault.Bus,
) {
const alertName = "kopia-object-locking"
directRepo, ok := w.Repository.(repo.DirectRepository)
if !ok {
errs.AddAlert(ctx, fault.NewAlert(
"",
corsoWrapperAlertNamespace,
"fetch-direct-repo",
alertName,
nil))
return
}
blobConfig, maintenanceParams, err := getRetentionConfigs(ctx, directRepo)
if err != nil {
errs.AddAlert(ctx, fault.NewAlert(
err.Error(),
corsoWrapperAlertNamespace,
"fetch-config",
alertName,
nil))
return
}
err = retention.OptsFromConfigs(*blobConfig, *maintenanceParams).
Verify(ctx)
if err != nil {
errs.AddAlert(ctx, fault.NewAlert(
err.Error(),
corsoWrapperAlertNamespace,
"config-values",
alertName,
nil))
}
}
// verifyDefaultConfigOptions checks the following configurations:
// kopia global policy:
// - kopia snapshot retention is disabled
// - kopia compression matches the default compression for corso
// - kopia scheduling is disabled
//
// object locking:
// - maintenance and blob config blob parameters are consistent (i.e. all
// enabled or all disabled)
func (w *conn) verifyDefaultConfigOptions(
ctx context.Context,
errs *fault.Bus,
) {
logger.Ctx(ctx).Info("verifying config parameters")
w.verifyDefaultPolicyConfigOptions(ctx, errs)
w.verifyRetentionConfig(ctx, errs)
}
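A minimal sketch of the intended call pattern, matching the integration test later in this diff: verification never fails the run on its own, it only accumulates alerts on the fault bus (verifyRepoConfig is a hypothetical wrapper):

// Assumes the imports and *conn type from the surrounding file.
func verifyRepoConfig(ctx context.Context, w *conn) error {
	errs := fault.New(true)
	w.verifyDefaultConfigOptions(ctx, errs)

	// Alerts flag config drift without failing the operation.
	for _, alert := range errs.Alerts() {
		logger.Ctx(ctx).Infow("repo config drift", "alert", alert)
	}

	return errs.Failure()
}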


@ -3,6 +3,7 @@ package kopia
import ( import (
"context" "context"
"math" "math"
"strings"
"testing" "testing"
"time" "time"
@ -15,11 +16,13 @@ import (
"github.com/stretchr/testify/assert" "github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require" "github.com/stretchr/testify/require"
"github.com/stretchr/testify/suite" "github.com/stretchr/testify/suite"
"golang.org/x/exp/maps"
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
strTD "github.com/alcionai/corso/src/internal/common/str/testdata" strTD "github.com/alcionai/corso/src/internal/common/str/testdata"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/pkg/control/repository" "github.com/alcionai/corso/src/pkg/control/repository"
"github.com/alcionai/corso/src/pkg/fault"
"github.com/alcionai/corso/src/pkg/storage" "github.com/alcionai/corso/src/pkg/storage"
storeTD "github.com/alcionai/corso/src/pkg/storage/testdata" storeTD "github.com/alcionai/corso/src/pkg/storage/testdata"
) )
@ -93,7 +96,7 @@ func TestWrapperIntegrationSuite(t *testing.T) {
}) })
} }
func (suite *WrapperIntegrationSuite) TestRepoExistsError() { func (suite *WrapperIntegrationSuite) TestInitialize_SamePassphrase() {
t := suite.T() t := suite.T()
repoNameHash := strTD.NewHashForRepoConfigName() repoNameHash := strTD.NewHashForRepoConfigName()
@ -109,6 +112,46 @@ func (suite *WrapperIntegrationSuite) TestRepoExistsError() {
err = k.Close(ctx) err = k.Close(ctx)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
err = k.Initialize(ctx, repository.Options{}, repository.Retention{}, repoNameHash)
assert.NoError(t, err, clues.ToCore(err))
}
func (suite *WrapperIntegrationSuite) TestInitialize_IncorrectPassphrase() {
t := suite.T()
repoNameHash := strTD.NewHashForRepoConfigName()
ctx, flush := tester.NewContext(t)
defer flush()
st1 := storeTD.NewFilesystemStorage(t)
k := NewConn(st1)
err := k.Initialize(ctx, repository.Options{}, repository.Retention{}, repoNameHash)
require.NoError(t, err, clues.ToCore(err))
err = k.Close(ctx)
require.NoError(t, err, clues.ToCore(err))
// Hacky way to edit the existing passphrase for the repo so we can check that
// we get a sensible error back.
st2 := st1
st2.Config = maps.Clone(st1.Config)
var found bool
for k, v := range st2.Config {
if strings.Contains(strings.ToLower(k), "passphrase") {
st2.Config[k] = v + "1"
found = true
break
}
}
require.True(t, found, "unable to update passphrase for test")
k = NewConn(st2)
err = k.Initialize(ctx, repository.Options{}, repository.Retention{}, repoNameHash) err = k.Initialize(ctx, repository.Options{}, repository.Retention{}, repoNameHash)
assert.Error(t, err, clues.ToCore(err)) assert.Error(t, err, clues.ToCore(err))
assert.ErrorIs(t, err, ErrorRepoAlreadyExists) assert.ErrorIs(t, err, ErrorRepoAlreadyExists)
@ -779,3 +822,281 @@ func (suite *ConnRetentionIntegrationSuite) TestInitWithAndWithoutRetention() {
// Some checks to make sure retention was fully initialized as expected. // Some checks to make sure retention was fully initialized as expected.
checkRetentionParams(t, ctx, k2, blob.Governance, time.Hour*48, assert.True) checkRetentionParams(t, ctx, k2, blob.Governance, time.Hour*48, assert.True)
} }
// TestVerifyDefaultConfigOptions checks that if the repo has misconfigured
// values an alert is raised. This is easiest to do in a test suite that
// allows object locking because some of the checked config values relate to
// object locking.
func (suite *ConnRetentionIntegrationSuite) TestVerifyDefaultConfigOptions() {
nonzeroOpt := policy.OptionalInt(42)
table := []struct {
name string
setupRepo func(context.Context, *testing.T, *conn)
expectAlerts int
}{
{
name: "ValidConfigs NoRetention",
setupRepo: func(context.Context, *testing.T, *conn) {},
},
{
name: "ValidConfigs Retention",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
err := con.setRetentionParameters(
ctx,
repository.Retention{
Mode: ptr.To(repository.GovernanceRetention),
Duration: ptr.To(48 * time.Hour),
Extend: ptr.To(true),
})
require.NoError(t, err, clues.ToCore(err))
},
},
{
name: "ValidRetentionButNotExtending",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
err := con.setRetentionParameters(
ctx,
repository.Retention{
Mode: ptr.To(repository.GovernanceRetention),
Duration: ptr.To(48 * time.Hour),
Extend: ptr.To(false),
})
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "ExtendingRetentionButNotConfigured",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
err := con.setRetentionParameters(
ctx,
repository.Retention{
Extend: ptr.To(true),
})
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroScheduleInterval",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateSchedulingOnPolicy(time.Hour, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "OldValidCompressor",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
_, err = updateCompressionOnPolicy("s2-default", pol)
require.NoError(t, err, clues.ToCore(err))
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 0,
},
{
name: "NonDefaultCompression",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
_, err = updateCompressionOnPolicy("pgzip-best-speed", pol)
require.NoError(t, err, clues.ToCore(err))
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroSnapshotRetentionLatest",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
retention := policy.RetentionPolicy{
KeepLatest: &nonzeroOpt,
KeepHourly: &zeroOpt,
KeepWeekly: &zeroOpt,
KeepDaily: &zeroOpt,
KeepMonthly: &zeroOpt,
KeepAnnual: &zeroOpt,
}
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateRetentionOnPolicy(retention, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroSnapshotRetentionHourly",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
retention := policy.RetentionPolicy{
KeepLatest: &zeroOpt,
KeepHourly: &nonzeroOpt,
KeepWeekly: &zeroOpt,
KeepDaily: &zeroOpt,
KeepMonthly: &zeroOpt,
KeepAnnual: &zeroOpt,
}
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateRetentionOnPolicy(retention, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroSnapshotRetentionWeekly",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
retention := policy.RetentionPolicy{
KeepLatest: &zeroOpt,
KeepHourly: &zeroOpt,
KeepWeekly: &nonzeroOpt,
KeepDaily: &zeroOpt,
KeepMonthly: &zeroOpt,
KeepAnnual: &zeroOpt,
}
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateRetentionOnPolicy(retention, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroSnapshotRetentionDaily",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
retention := policy.RetentionPolicy{
KeepLatest: &zeroOpt,
KeepHourly: &zeroOpt,
KeepWeekly: &zeroOpt,
KeepDaily: &nonzeroOpt,
KeepMonthly: &zeroOpt,
KeepAnnual: &zeroOpt,
}
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateRetentionOnPolicy(retention, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroSnapshotRetentionMonthly",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
retention := policy.RetentionPolicy{
KeepLatest: &zeroOpt,
KeepHourly: &zeroOpt,
KeepWeekly: &zeroOpt,
KeepDaily: &zeroOpt,
KeepMonthly: &nonzeroOpt,
KeepAnnual: &zeroOpt,
}
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateRetentionOnPolicy(retention, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "NonZeroSnapshotRetentionAnnual",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
retention := policy.RetentionPolicy{
KeepLatest: &zeroOpt,
KeepHourly: &zeroOpt,
KeepWeekly: &zeroOpt,
KeepDaily: &zeroOpt,
KeepMonthly: &zeroOpt,
KeepAnnual: &nonzeroOpt,
}
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateRetentionOnPolicy(retention, pol)
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 1,
},
{
name: "MultipleAlerts",
setupRepo: func(ctx context.Context, t *testing.T, con *conn) {
err := con.setRetentionParameters(
ctx,
repository.Retention{
Mode: ptr.To(repository.GovernanceRetention),
Duration: ptr.To(48 * time.Hour),
Extend: ptr.To(false),
})
require.NoError(t, err, clues.ToCore(err))
pol, err := con.getGlobalPolicyOrEmpty(ctx)
require.NoError(t, err, clues.ToCore(err))
updateSchedulingOnPolicy(time.Hour, pol)
_, err = updateCompressionOnPolicy("pgzip-best-speed", pol)
require.NoError(t, err, clues.ToCore(err))
err = con.writeGlobalPolicy(ctx, "test", pol)
require.NoError(t, err, clues.ToCore(err))
},
expectAlerts: 3,
},
}
for _, test := range table {
suite.Run(test.name, func() {
t := suite.T()
ctx, flush := tester.NewContext(t)
t.Cleanup(flush)
repoNameHash := strTD.NewHashForRepoConfigName()
st1 := storeTD.NewPrefixedS3Storage(t)
con := NewConn(st1)
err := con.Initialize(ctx, repository.Options{}, repository.Retention{}, repoNameHash)
require.NoError(t, err, clues.ToCore(err))
t.Cleanup(func() { con.Close(ctx) })
test.setupRepo(ctx, t, con)
errs := fault.New(true)
con.verifyDefaultConfigOptions(ctx, errs)
// There shouldn't be any reported failures because these checks only
// raise alerts rather than errors.
assert.NoError(t, errs.Failure(), clues.ToCore(errs.Failure()))
assert.Len(t, errs.Alerts(), test.expectAlerts)
})
}
}


@ -665,7 +665,12 @@ func (w Wrapper) RepoMaintenance(
ctx context.Context, ctx context.Context,
storer store.Storer, storer store.Storer,
opts repository.Maintenance, opts repository.Maintenance,
errs *fault.Bus,
) error { ) error {
// Check the existing config parameters first so that even if we fail for some
// reason below we know we checked the config.
w.c.verifyDefaultConfigOptions(ctx, errs)
kopiaSafety, err := translateSafety(opts.Safety) kopiaSafety, err := translateSafety(opts.Safety)
if err != nil { if err != nil {
return clues.WrapWC(ctx, err, "identifying safety level") return clues.WrapWC(ctx, err, "identifying safety level")
@ -696,8 +701,9 @@ func (w Wrapper) RepoMaintenance(
// Even if we fail this we don't want to fail the overall maintenance // Even if we fail this we don't want to fail the overall maintenance
// operation since there's other useful work we can still do. // operation since there's other useful work we can still do.
if err := cleanupOrphanedData(ctx, storer, w.c, buffer, time.Now); err != nil { if err := cleanupOrphanedData(ctx, storer, w.c, buffer, time.Now); err != nil {
logger.CtxErr(ctx, err).Info( errs.AddRecoverable(ctx, clues.Wrap(
"cleaning up failed backups, some space may not be freed") err,
"cleaning up failed backups, some space may not be freed"))
} }
} }


@ -27,7 +27,6 @@ import (
strTD "github.com/alcionai/corso/src/internal/common/str/testdata" strTD "github.com/alcionai/corso/src/internal/common/str/testdata"
"github.com/alcionai/corso/src/internal/data" "github.com/alcionai/corso/src/internal/data"
dataMock "github.com/alcionai/corso/src/internal/data/mock" dataMock "github.com/alcionai/corso/src/internal/data/mock"
"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
exchMock "github.com/alcionai/corso/src/internal/m365/service/exchange/mock" exchMock "github.com/alcionai/corso/src/internal/m365/service/exchange/mock"
istats "github.com/alcionai/corso/src/internal/stats" istats "github.com/alcionai/corso/src/internal/stats"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
@ -38,6 +37,7 @@ import (
"github.com/alcionai/corso/src/pkg/fault" "github.com/alcionai/corso/src/pkg/fault"
"github.com/alcionai/corso/src/pkg/logger" "github.com/alcionai/corso/src/pkg/logger"
"github.com/alcionai/corso/src/pkg/path" "github.com/alcionai/corso/src/pkg/path"
"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
storeTD "github.com/alcionai/corso/src/pkg/storage/testdata" storeTD "github.com/alcionai/corso/src/pkg/storage/testdata"
) )
@ -198,7 +198,7 @@ func (suite *BasicKopiaIntegrationSuite) TestMaintenance_FirstRun_NoChanges() {
Type: repository.MetadataMaintenance, Type: repository.MetadataMaintenance,
} }
err = w.RepoMaintenance(ctx, nil, opts) err = w.RepoMaintenance(ctx, nil, opts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
} }
@ -220,7 +220,7 @@ func (suite *BasicKopiaIntegrationSuite) TestMaintenance_WrongUser_NoForce_Fails
} }
// This will set the user. // This will set the user.
err = w.RepoMaintenance(ctx, nil, mOpts) err = w.RepoMaintenance(ctx, nil, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
err = k.Close(ctx) err = k.Close(ctx)
@ -236,7 +236,7 @@ func (suite *BasicKopiaIntegrationSuite) TestMaintenance_WrongUser_NoForce_Fails
var notOwnedErr maintenance.NotOwnedError var notOwnedErr maintenance.NotOwnedError
err = w.RepoMaintenance(ctx, nil, mOpts) err = w.RepoMaintenance(ctx, nil, mOpts, fault.New(true))
assert.ErrorAs(t, err, &notOwnedErr, clues.ToCore(err)) assert.ErrorAs(t, err, &notOwnedErr, clues.ToCore(err))
} }
@ -258,7 +258,7 @@ func (suite *BasicKopiaIntegrationSuite) TestMaintenance_WrongUser_Force_Succeed
} }
// This will set the user. // This will set the user.
err = w.RepoMaintenance(ctx, nil, mOpts) err = w.RepoMaintenance(ctx, nil, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
err = k.Close(ctx) err = k.Close(ctx)
@ -275,13 +275,13 @@ func (suite *BasicKopiaIntegrationSuite) TestMaintenance_WrongUser_Force_Succeed
mOpts.Force = true mOpts.Force = true
// This will set the user. // This will set the user.
err = w.RepoMaintenance(ctx, nil, mOpts) err = w.RepoMaintenance(ctx, nil, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
mOpts.Force = false mOpts.Force = false
// Running without force should succeed now. // Running without force should succeed now.
err = w.RepoMaintenance(ctx, nil, mOpts) err = w.RepoMaintenance(ctx, nil, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
} }
@ -733,7 +733,7 @@ func (suite *RetentionIntegrationSuite) TestSetRetentionParameters_And_Maintenan
// This will set common maintenance config parameters. There's some interplay // This will set common maintenance config parameters. There's some interplay
// between the maintenance schedule and retention period that we want to check // between the maintenance schedule and retention period that we want to check
// below. // below.
err = w.RepoMaintenance(ctx, nil, mOpts) err = w.RepoMaintenance(ctx, nil, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
// Enable retention. // Enable retention.
@ -838,7 +838,7 @@ func (suite *RetentionIntegrationSuite) TestSetAndUpdateRetentionParameters_RunM
// This will set common maintenance config parameters. There's some interplay // This will set common maintenance config parameters. There's some interplay
// between the maintenance schedule and retention period that we want to check // between the maintenance schedule and retention period that we want to check
// below. // below.
err = w.RepoMaintenance(ctx, ms, mOpts) err = w.RepoMaintenance(ctx, ms, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
// Enable retention. // Enable retention.
@ -882,7 +882,7 @@ func (suite *RetentionIntegrationSuite) TestSetAndUpdateRetentionParameters_RunM
// Run full maintenance again. This should extend object locks for things if // Run full maintenance again. This should extend object locks for things if
// they exist. // they exist.
err = w.RepoMaintenance(ctx, ms, mOpts) err = w.RepoMaintenance(ctx, ms, mOpts, fault.New(true))
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
}) })
} }


@ -2,6 +2,7 @@ package m365
import ( import (
"context" "context"
"fmt"
"github.com/alcionai/clues" "github.com/alcionai/clues"
@ -13,7 +14,10 @@ import (
"github.com/alcionai/corso/src/internal/m365/service/groups" "github.com/alcionai/corso/src/internal/m365/service/groups"
"github.com/alcionai/corso/src/internal/m365/service/onedrive" "github.com/alcionai/corso/src/internal/m365/service/onedrive"
"github.com/alcionai/corso/src/internal/m365/service/sharepoint" "github.com/alcionai/corso/src/internal/m365/service/sharepoint"
"github.com/alcionai/corso/src/internal/m365/service/teamschats"
"github.com/alcionai/corso/src/internal/m365/support"
"github.com/alcionai/corso/src/internal/operations/inject" "github.com/alcionai/corso/src/internal/operations/inject"
"github.com/alcionai/corso/src/pkg/account"
bupMD "github.com/alcionai/corso/src/pkg/backup/metadata" bupMD "github.com/alcionai/corso/src/pkg/backup/metadata"
"github.com/alcionai/corso/src/pkg/control" "github.com/alcionai/corso/src/pkg/control"
"github.com/alcionai/corso/src/pkg/count" "github.com/alcionai/corso/src/pkg/count"
@ -22,9 +26,33 @@ import (
"github.com/alcionai/corso/src/pkg/filters" "github.com/alcionai/corso/src/pkg/filters"
"github.com/alcionai/corso/src/pkg/path" "github.com/alcionai/corso/src/pkg/path"
"github.com/alcionai/corso/src/pkg/selectors" "github.com/alcionai/corso/src/pkg/selectors"
"github.com/alcionai/corso/src/pkg/services/m365/api"
"github.com/alcionai/corso/src/pkg/services/m365/api/graph" "github.com/alcionai/corso/src/pkg/services/m365/api/graph"
) )
type backupHandler interface {
produceBackupCollectionser
}
type produceBackupCollectionser interface {
ProduceBackupCollections(
ctx context.Context,
bpc inject.BackupProducerConfig,
ac api.Client,
creds account.M365Config,
su support.StatusUpdater,
counter *count.Bus,
errs *fault.Bus,
) (
collections []data.BackupCollection,
excludeItems *prefixmatcher.StringSetMatcher,
// canUsePreviousBackup can always be returned as true by implementations
// that always return a tombstone collection when the metadata read fails.
canUsePreviousBackup bool,
err error,
)
}
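Each service package now just exposes a NewBackup() value satisfying this interface. A no-op stub makes the contract concrete (stubBackup is hypothetical, for illustration only):

// Assumes the imports from the surrounding file.
type stubBackup struct{}

func (stubBackup) ProduceBackupCollections(
	ctx context.Context,
	bpc inject.BackupProducerConfig,
	ac api.Client,
	creds account.M365Config,
	su support.StatusUpdater,
	counter *count.Bus,
	errs *fault.Bus,
) ([]data.BackupCollection, *prefixmatcher.StringSetMatcher, bool, error) {
	// Returns no collections and claims the previous backup is usable.
	return nil, nil, true, nil
}

var _ backupHandler = stubBackup{}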
// --------------------------------------------------------------------------- // ---------------------------------------------------------------------------
// Data Collections // Data Collections
// --------------------------------------------------------------------------- // ---------------------------------------------------------------------------
@ -63,65 +91,38 @@ func (ctrl *Controller) ProduceBackupCollections(
canUsePreviousBackup bool canUsePreviousBackup bool
) )
var handler backupHandler
switch service { switch service {
case path.ExchangeService: case path.ExchangeService:
colls, excludeItems, canUsePreviousBackup, err = exchange.ProduceBackupCollections( handler = exchange.NewBackup()
ctx,
bpc,
ctrl.AC,
ctrl.credentials,
ctrl.UpdateStatus,
counter,
errs)
if err != nil {
return nil, nil, false, err
}
case path.OneDriveService: case path.OneDriveService:
colls, excludeItems, canUsePreviousBackup, err = onedrive.ProduceBackupCollections( handler = onedrive.NewBackup()
ctx,
bpc,
ctrl.AC,
ctrl.credentials,
ctrl.UpdateStatus,
counter,
errs)
if err != nil {
return nil, nil, false, err
}
case path.SharePointService: case path.SharePointService:
colls, excludeItems, canUsePreviousBackup, err = sharepoint.ProduceBackupCollections( handler = sharepoint.NewBackup()
ctx,
bpc,
ctrl.AC,
ctrl.credentials,
ctrl.UpdateStatus,
counter,
errs)
if err != nil {
return nil, nil, false, err
}
case path.GroupsService: case path.GroupsService:
colls, excludeItems, err = groups.ProduceBackupCollections( handler = groups.NewBackup()
ctx,
bpc,
ctrl.AC,
ctrl.credentials,
ctrl.UpdateStatus,
counter,
errs)
if err != nil {
return nil, nil, false, err
}
// canUsePreviousBacukp can be always returned true for groups as we case path.TeamsChatsService:
// return a tombstone collection in case the metadata read fails handler = teamschats.NewBackup()
canUsePreviousBackup = true
default: default:
return nil, nil, false, clues.Wrap(clues.NewWC(ctx, service.String()), "service not supported") return nil, nil, false, clues.NewWC(ctx, fmt.Sprintf("service not supported: %s", service.HumanString()))
}
colls, excludeItems, canUsePreviousBackup, err = handler.ProduceBackupCollections(
ctx,
bpc,
ctrl.AC,
ctrl.credentials,
ctrl.UpdateStatus,
counter,
errs)
if err != nil {
return nil, nil, false, err
} }
for _, c := range colls { for _, c := range colls {
@ -153,25 +154,28 @@ func (ctrl *Controller) IsServiceEnabled(
return sharepoint.IsServiceEnabled(ctx, ctrl.AC.Sites(), resourceOwner) return sharepoint.IsServiceEnabled(ctx, ctrl.AC.Sites(), resourceOwner)
case path.GroupsService: case path.GroupsService:
return groups.IsServiceEnabled(ctx, ctrl.AC.Groups(), resourceOwner) return groups.IsServiceEnabled(ctx, ctrl.AC.Groups(), resourceOwner)
case path.TeamsChatsService:
return teamschats.IsServiceEnabled(ctx, ctrl.AC.Users(), resourceOwner)
} }
return false, clues.Wrap(clues.NewWC(ctx, service.String()), "service not supported") return false, clues.Wrap(clues.NewWC(ctx, service.String()), "service not supported")
} }
func verifyBackupInputs(sels selectors.Selector, cachedIDs []string) error { func verifyBackupInputs(sel selectors.Selector, cachedIDs []string) error {
var ids []string var ids []string
switch sels.Service { switch sel.Service {
case selectors.ServiceExchange, selectors.ServiceOneDrive: case selectors.ServiceExchange, selectors.ServiceOneDrive:
// Exchange and OneDrive user existence now checked in checkServiceEnabled. // Exchange and OneDrive user existence now checked in checkServiceEnabled.
return nil return nil
case selectors.ServiceSharePoint, selectors.ServiceGroups: case selectors.ServiceSharePoint, selectors.ServiceGroups, selectors.ServiceTeamsChats:
ids = cachedIDs ids = cachedIDs
} }
if !filters.Contains(ids).Compare(sels.ID()) { if !filters.Contains(ids).Compare(sel.ID()) {
return clues.Stack(core.ErrNotFound).With("selector_protected_resource", sels.DiscreteOwner) return clues.Wrap(core.ErrNotFound, "verifying existence of resource").
With("selector_protected_resource", sel.ID())
} }
return nil return nil


@ -11,7 +11,6 @@ import (
"github.com/stretchr/testify/suite" "github.com/stretchr/testify/suite"
inMock "github.com/alcionai/corso/src/internal/common/idname/mock" inMock "github.com/alcionai/corso/src/internal/common/idname/mock"
"github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/data" "github.com/alcionai/corso/src/internal/data"
"github.com/alcionai/corso/src/internal/data/mock" "github.com/alcionai/corso/src/internal/data/mock"
"github.com/alcionai/corso/src/internal/m365/service/exchange" "github.com/alcionai/corso/src/internal/m365/service/exchange"
@ -19,6 +18,7 @@ import (
"github.com/alcionai/corso/src/internal/m365/service/sharepoint" "github.com/alcionai/corso/src/internal/m365/service/sharepoint"
"github.com/alcionai/corso/src/internal/operations/inject" "github.com/alcionai/corso/src/internal/operations/inject"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/internal/tester/its"
"github.com/alcionai/corso/src/internal/tester/tconfig" "github.com/alcionai/corso/src/internal/tester/tconfig"
"github.com/alcionai/corso/src/internal/version" "github.com/alcionai/corso/src/internal/version"
"github.com/alcionai/corso/src/pkg/control" "github.com/alcionai/corso/src/pkg/control"
@ -36,10 +36,7 @@ import (
type DataCollectionIntgSuite struct { type DataCollectionIntgSuite struct {
tester.Suite tester.Suite
user string m365 its.M365IntgTestSetup
site string
tenantID string
ac api.Client
} }
func TestDataCollectionIntgSuite(t *testing.T) { func TestDataCollectionIntgSuite(t *testing.T) {
@ -51,29 +48,14 @@ func TestDataCollectionIntgSuite(t *testing.T) {
} }
func (suite *DataCollectionIntgSuite) SetupSuite() { func (suite *DataCollectionIntgSuite) SetupSuite() {
t := suite.T() suite.m365 = its.GetM365(suite.T())
suite.user = tconfig.M365UserID(t)
suite.site = tconfig.M365SiteID(t)
acct := tconfig.NewM365Account(t)
creds, err := acct.M365Config()
require.NoError(t, err, clues.ToCore(err))
suite.tenantID = creds.AzureTenantID
suite.ac, err = api.NewClient(
creds,
control.DefaultOptions(),
count.New())
require.NoError(t, err, clues.ToCore(err))
} }
func (suite *DataCollectionIntgSuite) TestExchangeDataCollection() { func (suite *DataCollectionIntgSuite) TestExchangeDataCollection() {
ctx, flush := tester.NewContext(suite.T()) ctx, flush := tester.NewContext(suite.T())
defer flush() defer flush()
selUsers := []string{suite.user} selUsers := []string{suite.m365.User.ID}
ctrl := newController(ctx, suite.T(), path.ExchangeService) ctrl := newController(ctx, suite.T(), path.ExchangeService)
tests := []struct { tests := []struct {
@ -85,7 +67,7 @@ func (suite *DataCollectionIntgSuite) TestExchangeDataCollection() {
getSelector: func(t *testing.T) selectors.Selector { getSelector: func(t *testing.T) selectors.Selector {
sel := selectors.NewExchangeBackup(selUsers) sel := selectors.NewExchangeBackup(selUsers)
sel.Include(sel.MailFolders([]string{api.MailInbox}, selectors.PrefixMatch())) sel.Include(sel.MailFolders([]string{api.MailInbox}, selectors.PrefixMatch()))
sel.DiscreteOwner = suite.user sel.DiscreteOwner = suite.m365.User.ID
return sel.Selector return sel.Selector
}, },
}, },
@ -94,7 +76,7 @@ func (suite *DataCollectionIntgSuite) TestExchangeDataCollection() {
getSelector: func(t *testing.T) selectors.Selector { getSelector: func(t *testing.T) selectors.Selector {
sel := selectors.NewExchangeBackup(selUsers) sel := selectors.NewExchangeBackup(selUsers)
sel.Include(sel.ContactFolders([]string{api.DefaultContacts}, selectors.PrefixMatch())) sel.Include(sel.ContactFolders([]string{api.DefaultContacts}, selectors.PrefixMatch()))
sel.DiscreteOwner = suite.user sel.DiscreteOwner = suite.m365.User.ID
return sel.Selector return sel.Selector
}, },
}, },
@ -139,11 +121,11 @@ func (suite *DataCollectionIntgSuite) TestExchangeDataCollection() {
Selector: sel, Selector: sel,
} }
collections, excludes, canUsePreviousBackup, err := exchange.ProduceBackupCollections( collections, excludes, canUsePreviousBackup, err := exchange.NewBackup().ProduceBackupCollections(
ctx, ctx,
bpc, bpc,
suite.ac, suite.m365.AC,
suite.ac.Credentials, suite.m365.Creds,
ctrl.UpdateStatus, ctrl.UpdateStatus,
count.New(), count.New(),
fault.New(true)) fault.New(true))
@ -270,7 +252,7 @@ func (suite *DataCollectionIntgSuite) TestSharePointDataCollection() {
ctx, flush := tester.NewContext(suite.T()) ctx, flush := tester.NewContext(suite.T())
defer flush() defer flush()
selSites := []string{suite.site} selSites := []string{suite.m365.Site.ID}
ctrl := newController(ctx, suite.T(), path.SharePointService) ctrl := newController(ctx, suite.T(), path.SharePointService)
tests := []struct { tests := []struct {
name string name string
@ -309,10 +291,10 @@ func (suite *DataCollectionIntgSuite) TestSharePointDataCollection() {
Selector: sel, Selector: sel,
} }
collections, excludes, canUsePreviousBackup, err := sharepoint.ProduceBackupCollections( collections, excludes, canUsePreviousBackup, err := sharepoint.NewBackup().ProduceBackupCollections(
ctx, ctx,
bpc, bpc,
suite.ac, suite.m365.AC,
ctrl.credentials, ctrl.credentials,
ctrl.UpdateStatus, ctrl.UpdateStatus,
count.New(), count.New(),
@ -351,8 +333,7 @@ func (suite *DataCollectionIntgSuite) TestSharePointDataCollection() {
type SPCollectionIntgSuite struct { type SPCollectionIntgSuite struct {
tester.Suite tester.Suite
connector *Controller m365 its.M365IntgTestSetup
user string
} }
func TestSPCollectionIntgSuite(t *testing.T) { func TestSPCollectionIntgSuite(t *testing.T) {
@ -364,13 +345,7 @@ func TestSPCollectionIntgSuite(t *testing.T) {
} }
func (suite *SPCollectionIntgSuite) SetupSuite() { func (suite *SPCollectionIntgSuite) SetupSuite() {
ctx, flush := tester.NewContext(suite.T()) suite.m365 = its.GetM365(suite.T())
defer flush()
suite.connector = newController(ctx, suite.T(), path.SharePointService)
suite.user = tconfig.M365UserID(suite.T())
tester.LogTimeOfTest(suite.T())
} }
func (suite *SPCollectionIntgSuite) TestCreateSharePointCollection_Libraries() { func (suite *SPCollectionIntgSuite) TestCreateSharePointCollection_Libraries() {
@ -379,25 +354,20 @@ func (suite *SPCollectionIntgSuite) TestCreateSharePointCollection_Libraries() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
var ( ctrl := newController(ctx, t, path.SharePointService)
siteID = tconfig.M365SiteID(t)
ctrl = newController(ctx, t, path.SharePointService)
siteIDs = []string{siteID}
)
site, err := ctrl.PopulateProtectedResourceIDAndName(ctx, siteID, nil) _, err := ctrl.PopulateProtectedResourceIDAndName(ctx, suite.m365.Site.ID, nil)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
sel := selectors.NewSharePointBackup(siteIDs) sel := selectors.NewSharePointBackup([]string{suite.m365.Site.ID})
sel.Include(sel.LibraryFolders([]string{"foo"}, selectors.PrefixMatch())) sel.Include(sel.LibraryFolders([]string{"foo"}, selectors.PrefixMatch()))
sel.Include(sel.Library("Documents")) sel.Include(sel.Library("Documents"))
sel.SetDiscreteOwnerIDName(suite.m365.Site.ID, suite.m365.Site.WebURL)
sel.SetDiscreteOwnerIDName(site.ID(), site.Name())
bpc := inject.BackupProducerConfig{ bpc := inject.BackupProducerConfig{
LastBackupVersion: version.NoBackup, LastBackupVersion: version.NoBackup,
Options: control.DefaultOptions(), Options: control.DefaultOptions(),
ProtectedResource: site, ProtectedResource: suite.m365.Site.Provider,
Selector: sel.Selector, Selector: sel.Selector,
} }
@ -415,15 +385,15 @@ func (suite *SPCollectionIntgSuite) TestCreateSharePointCollection_Libraries() {
) )
documentsColl, err := path.BuildPrefix( documentsColl, err := path.BuildPrefix(
suite.connector.tenant, suite.m365.TenantID,
siteID, suite.m365.Site.ID,
path.SharePointService, path.SharePointService,
path.LibrariesCategory) path.LibrariesCategory)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
metadataColl, err := path.BuildMetadata( metadataColl, err := path.BuildMetadata(
suite.connector.tenant, suite.m365.TenantID,
siteID, suite.m365.Site.ID,
path.SharePointService, path.SharePointService,
path.LibrariesCategory, path.LibrariesCategory,
false) false)
@ -450,24 +420,19 @@ func (suite *SPCollectionIntgSuite) TestCreateSharePointCollection_Lists() {
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
var ( ctrl := newController(ctx, t, path.SharePointService)
siteID = tconfig.M365SiteID(t)
ctrl = newController(ctx, t, path.SharePointService)
siteIDs = []string{siteID}
)
site, err := ctrl.PopulateProtectedResourceIDAndName(ctx, siteID, nil) _, err := ctrl.PopulateProtectedResourceIDAndName(ctx, suite.m365.Site.ID, nil)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
sel := selectors.NewSharePointBackup(siteIDs) sel := selectors.NewSharePointBackup([]string{suite.m365.Site.ID})
sel.Include(sel.Lists(selectors.Any())) sel.Include(sel.Lists(selectors.Any()))
sel.SetDiscreteOwnerIDName(suite.m365.Site.ID, suite.m365.Site.WebURL)
sel.SetDiscreteOwnerIDName(site.ID(), site.Name())
bpc := inject.BackupProducerConfig{ bpc := inject.BackupProducerConfig{
LastBackupVersion: version.NoBackup, LastBackupVersion: version.NoBackup,
Options: control.DefaultOptions(), Options: control.DefaultOptions(),
ProtectedResource: site, ProtectedResource: suite.m365.Site.Provider,
Selector: sel.Selector, Selector: sel.Selector,
} }
@ -502,9 +467,7 @@ func (suite *SPCollectionIntgSuite) TestCreateSharePointCollection_Lists() {
type GroupsCollectionIntgSuite struct { type GroupsCollectionIntgSuite struct {
tester.Suite tester.Suite
connector *Controller m365 its.M365IntgTestSetup
tenantID string
user string
} }
func TestGroupsCollectionIntgSuite(t *testing.T) { func TestGroupsCollectionIntgSuite(t *testing.T) {
@ -516,21 +479,7 @@ func TestGroupsCollectionIntgSuite(t *testing.T) {
} }
func (suite *GroupsCollectionIntgSuite) SetupSuite() { func (suite *GroupsCollectionIntgSuite) SetupSuite() {
t := suite.T() suite.m365 = its.GetM365(suite.T())
ctx, flush := tester.NewContext(t)
defer flush()
suite.connector = newController(ctx, t, path.GroupsService)
suite.user = tconfig.M365UserID(t)
acct := tconfig.NewM365Account(t)
creds, err := acct.M365Config()
require.NoError(t, err, clues.ToCore(err))
suite.tenantID = creds.AzureTenantID
tester.LogTimeOfTest(t)
} }
func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint() { func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint() {
@ -539,24 +488,19 @@ func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint()
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
var ( ctrl := newController(ctx, t, path.GroupsService)
groupID = tconfig.M365TeamID(t)
ctrl = newController(ctx, t, path.GroupsService)
groupIDs = []string{groupID}
)
group, err := ctrl.PopulateProtectedResourceIDAndName(ctx, groupID, nil) _, err := ctrl.PopulateProtectedResourceIDAndName(ctx, suite.m365.Group.ID, nil)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
sel := selectors.NewGroupsBackup(groupIDs) sel := selectors.NewGroupsBackup([]string{suite.m365.Group.ID})
sel.Include(sel.LibraryFolders([]string{"test"}, selectors.PrefixMatch())) sel.Include(sel.LibraryFolders([]string{"test"}, selectors.PrefixMatch()))
sel.SetDiscreteOwnerIDName(suite.m365.Group.ID, suite.m365.Group.DisplayName)
sel.SetDiscreteOwnerIDName(group.ID(), group.Name())
bpc := inject.BackupProducerConfig{ bpc := inject.BackupProducerConfig{
LastBackupVersion: version.NoBackup, LastBackupVersion: version.NoBackup,
Options: control.DefaultOptions(), Options: control.DefaultOptions(),
ProtectedResource: group, ProtectedResource: suite.m365.Group.Provider,
Selector: sel.Selector, Selector: sel.Selector,
} }
@ -575,8 +519,8 @@ func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint()
assert.Greater(t, len(collections), 1) assert.Greater(t, len(collections), 1)
p, err := path.BuildMetadata( p, err := path.BuildMetadata(
suite.tenantID, suite.m365.TenantID,
groupID, suite.m365.Group.ID,
path.GroupsService, path.GroupsService,
path.LibrariesCategory, path.LibrariesCategory,
false) false)
@ -614,31 +558,23 @@ func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint_In
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
var ( ctrl := newController(ctx, t, path.GroupsService)
groupID = tconfig.M365TeamID(t)
ctrl = newController(ctx, t, path.GroupsService)
groupIDs = []string{groupID}
)
group, err := ctrl.PopulateProtectedResourceIDAndName(ctx, groupID, nil) _, err := ctrl.PopulateProtectedResourceIDAndName(ctx, suite.m365.Group.ID, nil)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
sel := selectors.NewGroupsBackup(groupIDs) sel := selectors.NewGroupsBackup([]string{suite.m365.Group.ID})
sel.Include(sel.LibraryFolders([]string{"test"}, selectors.PrefixMatch())) sel.Include(sel.LibraryFolders([]string{"test"}, selectors.PrefixMatch()))
sel.SetDiscreteOwnerIDName(suite.m365.Group.ID, suite.m365.Group.DisplayName)
sel.SetDiscreteOwnerIDName(group.ID(), group.Name())
site, err := suite.connector.AC.Groups().GetRootSite(ctx, groupID)
require.NoError(t, err, clues.ToCore(err))
pth, err := path.Build( pth, err := path.Build(
suite.tenantID, suite.m365.TenantID,
groupID, suite.m365.Group.ID,
path.GroupsService, path.GroupsService,
path.LibrariesCategory, path.LibrariesCategory,
true, true,
odConsts.SitesPathDir, odConsts.SitesPathDir,
ptr.Val(site.GetId())) suite.m365.Group.RootSite.ID)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
mmc := []data.RestoreCollection{ mmc := []data.RestoreCollection{
@ -656,7 +592,7 @@ func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint_In
bpc := inject.BackupProducerConfig{ bpc := inject.BackupProducerConfig{
LastBackupVersion: version.NoBackup, LastBackupVersion: version.NoBackup,
Options: control.DefaultOptions(), Options: control.DefaultOptions(),
ProtectedResource: group, ProtectedResource: suite.m365.Group.Provider,
Selector: sel.Selector, Selector: sel.Selector,
MetadataCollections: mmc, MetadataCollections: mmc,
} }
@ -676,8 +612,8 @@ func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint_In
assert.Greater(t, len(collections), 1) assert.Greater(t, len(collections), 1)
p, err := path.BuildMetadata( p, err := path.BuildMetadata(
suite.tenantID, suite.m365.TenantID,
groupID, suite.m365.Group.ID,
path.GroupsService, path.GroupsService,
path.LibrariesCategory, path.LibrariesCategory,
false) false)
@ -690,13 +626,13 @@ func (suite *GroupsCollectionIntgSuite) TestCreateGroupsCollection_SharePoint_In
foundRootTombstone := false foundRootTombstone := false
sp, err := path.BuildPrefix( sp, err := path.BuildPrefix(
suite.tenantID, suite.m365.TenantID,
groupID, suite.m365.Group.ID,
path.GroupsService, path.GroupsService,
path.LibrariesCategory) path.LibrariesCategory)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
sp, err = sp.Append(false, odConsts.SitesPathDir, ptr.Val(site.GetId())) sp, err = sp.Append(false, odConsts.SitesPathDir, suite.m365.Group.RootSite.ID)
require.NoError(t, err, clues.ToCore(err)) require.NoError(t, err, clues.ToCore(err))
for _, coll := range collections { for _, coll := range collections {


@ -16,7 +16,6 @@ import (
"github.com/alcionai/corso/src/internal/common/idname" "github.com/alcionai/corso/src/internal/common/idname"
"github.com/alcionai/corso/src/internal/common/ptr" "github.com/alcionai/corso/src/internal/common/ptr"
"github.com/alcionai/corso/src/internal/data" "github.com/alcionai/corso/src/internal/data"
"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
"github.com/alcionai/corso/src/internal/m365/support" "github.com/alcionai/corso/src/internal/m365/support"
"github.com/alcionai/corso/src/internal/observe" "github.com/alcionai/corso/src/internal/observe"
"github.com/alcionai/corso/src/pkg/backup/details" "github.com/alcionai/corso/src/pkg/backup/details"
@ -29,6 +28,7 @@ import (
"github.com/alcionai/corso/src/pkg/path" "github.com/alcionai/corso/src/pkg/path"
"github.com/alcionai/corso/src/pkg/services/m365/api" "github.com/alcionai/corso/src/pkg/services/m365/api"
"github.com/alcionai/corso/src/pkg/services/m365/api/graph" "github.com/alcionai/corso/src/pkg/services/m365/api/graph"
"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
"github.com/alcionai/corso/src/pkg/services/m365/custom" "github.com/alcionai/corso/src/pkg/services/m365/custom"
) )
@ -366,7 +366,7 @@ func downloadContent(
itemID := ptr.Val(item.GetId()) itemID := ptr.Val(item.GetId())
ctx = clues.Add(ctx, "item_id", itemID) ctx = clues.Add(ctx, "item_id", itemID)
content, err := downloadItem(ctx, iaag, item) content, err := downloadItem(ctx, iaag, driveID, item)
if err == nil { if err == nil {
return content, nil return content, nil
} else if !graph.IsErrUnauthorizedOrBadToken(err) { } else if !graph.IsErrUnauthorizedOrBadToken(err) {
@ -395,7 +395,7 @@ func downloadContent(
cdi := custom.ToCustomDriveItem(di) cdi := custom.ToCustomDriveItem(di)
content, err = downloadItem(ctx, iaag, cdi) content, err = downloadItem(ctx, iaag, driveID, cdi)
if err != nil { if err != nil {
return nil, clues.Wrap(err, "content download retry") return nil, clues.Wrap(err, "content download retry")
} }
@ -426,7 +426,7 @@ func readItemContents(
return nil, core.ErrNotFound return nil, core.ErrNotFound
} }
rc, err := downloadFile(ctx, iaag, props.downloadURL) rc, err := downloadFile(ctx, iaag, props.downloadURL, false)
if graph.IsErrUnauthorizedOrBadToken(err) { if graph.IsErrUnauthorizedOrBadToken(err) {
logger.CtxErr(ctx, err).Debug("stale item in cache") logger.CtxErr(ctx, err).Debug("stale item in cache")
} }


@@ -21,7 +21,7 @@ import (
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/common/readers"
 	"github.com/alcionai/corso/src/internal/data"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
+	odmetadata "github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	metaTD "github.com/alcionai/corso/src/internal/m365/collection/drive/metadata/testdata"
 	odTD "github.com/alcionai/corso/src/internal/m365/service/onedrive/testdata"
 	"github.com/alcionai/corso/src/internal/m365/support"
@@ -34,6 +34,7 @@ import (
 	"github.com/alcionai/corso/src/pkg/fault"
 	"github.com/alcionai/corso/src/pkg/path"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/graph"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 	"github.com/alcionai/corso/src/pkg/services/m365/custom"
 )
@@ -73,13 +74,13 @@ func (suite *CollectionUnitSuite) TestCollection() {
 	stubMetaID       = "testMetaID"
 	stubMetaEntityID = "email@provider.com"
 	stubMetaRoles    = []string{"read", "write"}
-	stubMeta         = metadata.Metadata{
+	stubMeta         = odmetadata.Metadata{
 		FileName: stubItemName,
-		Permissions: []metadata.Permission{
+		Permissions: []odmetadata.Permission{
 			{
 				ID:         stubMetaID,
 				EntityID:   stubMetaEntityID,
-				EntityType: metadata.GV2User,
+				EntityType: odmetadata.GV2User,
 				Roles:      stubMetaRoles,
 				Expiration: &now,
 			},
@@ -208,7 +209,7 @@ func (suite *CollectionUnitSuite) TestCollection() {
 	mbh.GetErrs = []error{test.getErr}
 	mbh.GI = getsItem{Err: assert.AnError}

-	pcr := metaTD.NewStubPermissionResponse(metadata.GV2User, stubMetaID, stubMetaEntityID, stubMetaRoles)
+	pcr := metaTD.NewStubPermissionResponse(odmetadata.GV2User, stubMetaID, stubMetaEntityID, stubMetaRoles)
 	mbh.GIP = getsItemPermission{Perm: pcr}

 	coll, err := NewCollection(
@@ -294,7 +295,7 @@ func (suite *CollectionUnitSuite) TestCollection() {
 	assert.Equal(t, readers.DefaultSerializationVersion, rr.Format().Version)
 	assert.False(t, rr.Format().DelInFlight)

-	readMeta := metadata.Metadata{}
+	readMeta := odmetadata.Metadata{}
 	err = json.NewDecoder(rr).Decode(&readMeta)
 	require.NoError(t, err, clues.ToCore(err))
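The renames in this file (and in several files below) are disambiguation, not behavior changes: the OneDrive permission-metadata package and the newly imported graph API metadata package are both named metadata, so the former takes the odmetadata alias wherever both are needed. The pattern, with the import paths exactly as they appear in this diff:

```go
import (
	// drive-item permission/link-share metadata, aliased to avoid the clash
	odmetadata "github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"

	// graph API metadata helpers keep the unaliased package name
	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
)
```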

View File

@@ -14,7 +14,6 @@ import (
 	"github.com/alcionai/corso/src/internal/common/prefixmatcher"
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/data"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	odConsts "github.com/alcionai/corso/src/internal/m365/service/onedrive/consts"
 	"github.com/alcionai/corso/src/internal/m365/support"
 	bupMD "github.com/alcionai/corso/src/pkg/backup/metadata"
@@ -26,6 +25,7 @@ import (
 	"github.com/alcionai/corso/src/pkg/path"
 	"github.com/alcionai/corso/src/pkg/services/m365/api"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/graph"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/pagers"
 	"github.com/alcionai/corso/src/pkg/services/m365/custom"
 )

View File

@@ -9,9 +9,9 @@ import (
 	"golang.org/x/exp/maps"

 	"github.com/alcionai/corso/src/internal/common/ptr"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	"github.com/alcionai/corso/src/pkg/logger"
 	"github.com/alcionai/corso/src/pkg/path"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 	"github.com/alcionai/corso/src/pkg/services/m365/custom"
 )

View File

@@ -7,13 +7,13 @@ import (
 	"github.com/alcionai/clues"

 	"github.com/alcionai/corso/src/internal/data"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	"github.com/alcionai/corso/src/internal/version"
 	"github.com/alcionai/corso/src/pkg/control"
 	"github.com/alcionai/corso/src/pkg/export"
 	"github.com/alcionai/corso/src/pkg/fault"
 	"github.com/alcionai/corso/src/pkg/metrics"
 	"github.com/alcionai/corso/src/pkg/path"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 )

 func NewExportCollection(

View File

@@ -12,9 +12,9 @@ import (
 	"github.com/alcionai/corso/src/internal/data"
 	dataMock "github.com/alcionai/corso/src/internal/data/mock"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	"github.com/alcionai/corso/src/internal/tester"
 	"github.com/alcionai/corso/src/internal/version"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 )

 type ExportUnitSuite struct {

View File

@@ -19,12 +19,9 @@ import (
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/data"
 	dataMock "github.com/alcionai/corso/src/internal/data/mock"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	odConsts "github.com/alcionai/corso/src/internal/m365/service/onedrive/consts"
 	"github.com/alcionai/corso/src/internal/m365/support"
 	"github.com/alcionai/corso/src/internal/tester"
-	"github.com/alcionai/corso/src/internal/tester/tconfig"
-	"github.com/alcionai/corso/src/pkg/account"
 	"github.com/alcionai/corso/src/pkg/backup/details"
 	bupMD "github.com/alcionai/corso/src/pkg/backup/metadata"
 	"github.com/alcionai/corso/src/pkg/control"
@@ -34,6 +31,7 @@ import (
 	"github.com/alcionai/corso/src/pkg/selectors"
 	"github.com/alcionai/corso/src/pkg/services/m365/api"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/graph"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 	apiMock "github.com/alcionai/corso/src/pkg/services/m365/api/mock"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/pagers"
 	"github.com/alcionai/corso/src/pkg/services/m365/custom"
@@ -41,50 +39,6 @@ import (
 const defaultFileSize int64 = 42

-// TODO(ashmrtn): Merge with similar structs in graph and exchange packages.
-type oneDriveService struct {
-	credentials account.M365Config
-	status      support.ControllerOperationStatus
-	ac          api.Client
-}
-
-func newOneDriveService(credentials account.M365Config) (*oneDriveService, error) {
-	ac, err := api.NewClient(
-		credentials,
-		control.DefaultOptions(),
-		count.New())
-	if err != nil {
-		return nil, err
-	}
-
-	service := oneDriveService{
-		ac:          ac,
-		credentials: credentials,
-	}
-
-	return &service, nil
-}
-
-func (ods *oneDriveService) updateStatus(status *support.ControllerOperationStatus) {
-	if status == nil {
-		return
-	}
-
-	ods.status = support.MergeStatus(ods.status, *status)
-}
-
-func loadTestService(t *testing.T) *oneDriveService {
-	a := tconfig.NewM365Account(t)
-
-	creds, err := a.M365Config()
-	require.NoError(t, err, clues.ToCore(err))
-
-	service, err := newOneDriveService(creds)
-	require.NoError(t, err, clues.ToCore(err))
-
-	return service
-}
-
 // ---------------------------------------------------------------------------
 // collections
 // ---------------------------------------------------------------------------
@@ -841,7 +795,12 @@ func (h mockBackupHandler[T]) AugmentItemInfo(
 	return h.ItemInfo
 }

-func (h *mockBackupHandler[T]) Get(context.Context, string, map[string]string) (*http.Response, error) {
+func (h *mockBackupHandler[T]) Get(
+	context.Context,
+	string,
+	map[string]string,
+	bool,
+) (*http.Response, error) {
 	c := h.getCall
 	h.getCall++

View File

@@ -21,8 +21,10 @@ import (
 )

 const (
 	acceptHeaderKey        = "Accept"
 	acceptHeaderValue      = "*/*"
+	gigabyte               = 1024 * 1024 * 1024
+	largeFileDownloadLimit = 15 * gigabyte
 )

 // downloadUrlKeys is used to find the download URL in a DriveItem response.
@@ -33,7 +35,8 @@ var downloadURLKeys = []string{
 func downloadItem(
 	ctx context.Context,
-	ag api.Getter,
+	getter api.Getter,
+	driveID string,
 	item *custom.DriveItem,
 ) (io.ReadCloser, error) {
 	if item == nil {
@@ -41,36 +44,37 @@ func downloadItem(
 	}

 	var (
-		rc     io.ReadCloser
-		isFile = item.GetFile() != nil
-		err    error
+		// very large file content needs to be downloaded through a different endpoint, or else
+		// the download could take longer than the lifespan of the download token in the cached
+		// url, which will cause us to timeout on every download request, even if we refresh the
+		// download url right before the query.
+		url         = "https://graph.microsoft.com/v1.0/drives/" + driveID + "/items/" + ptr.Val(item.GetId()) + "/content"
+		reader      io.ReadCloser
+		err         error
+		isLargeFile = ptr.Val(item.GetSize()) > largeFileDownloadLimit
 	)

-	if isFile {
-		var (
-			url string
-			ad  = item.GetAdditionalData()
-		)
-
-		for _, key := range downloadURLKeys {
-			if v, err := str.AnyValueToString(key, ad); err == nil {
-				url = v
-				break
-			}
-		}
-
-		rc, err = downloadFile(ctx, ag, url)
-		if err != nil {
-			return nil, clues.Stack(err)
-		}
+	// if this isn't a file, no content is available for download
+	if item.GetFile() == nil {
+		return reader, nil
 	}

-	return rc, nil
+	// smaller files will maintain our current behavior (prefetching the download url with the
+	// url cache). That pattern works for us in general, and we only need to deviate for very
+	// large file sizes.
+	if !isLargeFile {
+		url = str.FirstIn(item.GetAdditionalData(), downloadURLKeys...)
+	}
+
+	reader, err = downloadFile(ctx, getter, url, isLargeFile)
+
+	return reader, clues.StackWC(ctx, err).OrNil()
 }

 type downloadWithRetries struct {
-	getter api.Getter
-	url    string
+	getter      api.Getter
+	requireAuth bool
+	url         string
 }

 func (dg *downloadWithRetries) SupportsRange() bool {
@@ -86,7 +90,7 @@ func (dg *downloadWithRetries) Get(
 	// wouldn't work without it (get 416 responses instead of 206).
 	headers[acceptHeaderKey] = acceptHeaderValue

-	resp, err := dg.getter.Get(ctx, dg.url, headers)
+	resp, err := dg.getter.Get(ctx, dg.url, headers, dg.requireAuth)
 	if err != nil {
 		return nil, clues.Wrap(err, "getting file")
 	}
@@ -96,7 +100,7 @@ func (dg *downloadWithRetries) Get(
 			resp.Body.Close()
 		}

-		return nil, clues.New("malware detected").Label(graph.LabelsMalware)
+		return nil, clues.NewWC(ctx, "malware detected").Label(graph.LabelsMalware)
 	}

 	if resp != nil && (resp.StatusCode/100) != 2 {
@@ -107,7 +111,7 @@ func (dg *downloadWithRetries) Get(
 		// upstream error checks can compare the status with
 		// clues.HasLabel(err, graph.LabelStatus(http.KnownStatusCode))
 		return nil, clues.
-			Wrap(clues.New(resp.Status), "non-2xx http response").
+			Wrap(clues.NewWC(ctx, resp.Status), "non-2xx http response").
 			Label(graph.LabelStatus(resp.StatusCode))
 	}
@@ -118,6 +122,7 @@ func downloadFile(
 	ctx context.Context,
 	ag api.Getter,
 	url string,
+	requireAuth bool,
 ) (io.ReadCloser, error) {
 	if len(url) == 0 {
 		return nil, clues.NewWC(ctx, "empty file url")
@@ -141,8 +146,9 @@ func downloadFile(
 	rc, err := readers.NewResetRetryHandler(
 		ctx,
 		&downloadWithRetries{
-			getter: ag,
-			url:    url,
+			getter:      ag,
+			requireAuth: requireAuth,
+			url:         url,
 		})

 	return rc, clues.Stack(err).OrNil()
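The downloadItem rewrite above splits downloads by size: anything over largeFileDownloadLimit (15 * 2^30 bytes, roughly 16 GB) skips the cached pre-authenticated @microsoft.graph.downloadUrl and instead streams from the Graph /content endpoint with requireAuth set, since a download that long would outlive the cached URL's token. A condensed sketch of just the URL-selection decision, under the assumption (consistent with its use in the hunk) that str.FirstIn returns the value of the first matching key:

```go
const (
	gigabyte               = 1024 * 1024 * 1024
	largeFileDownloadLimit = 15 * gigabyte
)

// chooseDownloadURL condenses the hunk above. Large files take the Graph
// /content endpoint and require per-request auth; small files keep the
// cached pre-authenticated download URL, which needs no extra auth.
func chooseDownloadURL(driveID string, item *custom.DriveItem) (url string, requireAuth bool) {
	if ptr.Val(item.GetSize()) > largeFileDownloadLimit {
		url = "https://graph.microsoft.com/v1.0/drives/" + driveID +
			"/items/" + ptr.Val(item.GetId()) + "/content"

		return url, true
	}

	return str.FirstIn(item.GetAdditionalData(), downloadURLKeys...), false
}
```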

View File

@@ -12,6 +12,7 @@ import (
 	"github.com/alcionai/corso/src/internal/common/idname"
 	"github.com/alcionai/corso/src/internal/common/prefixmatcher"
+	"github.com/alcionai/corso/src/internal/m365/support"
 	"github.com/alcionai/corso/src/internal/tester"
 	"github.com/alcionai/corso/src/internal/tester/tconfig"
 	"github.com/alcionai/corso/src/pkg/account"
@@ -153,7 +154,8 @@ func (suite *ItemCollectorUnitSuite) TestDrives() {
 			{
 				Values:   nil,
 				NextLink: nil,
+				// needs graph.Stack, not clues.Stack
 				Err:      graph.Stack(ctx, mySiteURLNotFound),
 			},
 		},
 		expectedErr: assert.NoError,
@@ -165,7 +167,8 @@ func (suite *ItemCollectorUnitSuite) TestDrives() {
 			{
 				Values:   nil,
 				NextLink: nil,
+				// needs graph.Stack, not clues.Stack
 				Err:      graph.Stack(ctx, mySiteNotFound),
 			},
 		},
 		expectedErr: assert.NoError,
@@ -231,6 +234,18 @@ func (suite *OneDriveIntgSuite) SetupSuite() {
 	require.NoError(t, err, clues.ToCore(err))
 }

+type stubStatusUpdater struct {
+	status support.ControllerOperationStatus
+}
+
+func (ssu *stubStatusUpdater) updateStatus(status *support.ControllerOperationStatus) {
+	if status == nil {
+		return
+	}
+
+	ssu.status = support.MergeStatus(ssu.status, *status)
+}
+
 func (suite *OneDriveIntgSuite) TestOneDriveNewCollections() {
 	creds, err := tconfig.NewM365Account(suite.T()).M365Config()
 	require.NoError(suite.T(), err, clues.ToCore(err))
@@ -256,10 +271,10 @@ func (suite *OneDriveIntgSuite) TestOneDriveNewCollections() {
 	defer flush()

 	var (
-		service = loadTestService(t)
-		scope   = selectors.
+		scope = selectors.
 			NewOneDriveBackup([]string{test.user}).
 			AllData()[0]
+		statusUpdater = stubStatusUpdater{}
 	)

 	colls := NewCollections(
@@ -272,7 +287,7 @@ func (suite *OneDriveIntgSuite) TestOneDriveNewCollections() {
 		},
 		creds.AzureTenantID,
 		idname.NewProvider(test.user, test.user),
-		service.updateStatus,
+		statusUpdater.updateStatus,
 		control.Options{
 			ToggleFeatures: control.Toggles{},
 		},

View File

@@ -17,6 +17,7 @@ import (
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/common/str"
 	"github.com/alcionai/corso/src/internal/tester"
+	"github.com/alcionai/corso/src/internal/tester/its"
 	"github.com/alcionai/corso/src/internal/tester/tconfig"
 	"github.com/alcionai/corso/src/pkg/control"
 	"github.com/alcionai/corso/src/pkg/control/testdata"
@@ -30,9 +31,7 @@ import (
 type ItemIntegrationSuite struct {
 	tester.Suite
-	user        string
-	userDriveID string
-	service     *oneDriveService
+	m365 its.M365IntgTestSetup
 }

 func TestItemIntegrationSuite(t *testing.T) {
@@ -44,25 +43,7 @@ func TestItemIntegrationSuite(t *testing.T) {
 }

 func (suite *ItemIntegrationSuite) SetupSuite() {
-	t := suite.T()
-
-	ctx, flush := tester.NewContext(t)
-	defer flush()
-
-	suite.service = loadTestService(t)
-	suite.user = tconfig.SecondaryM365UserID(t)
-
-	graph.InitializeConcurrencyLimiter(ctx, true, 4)
-
-	pager := suite.service.ac.Drives().NewUserDrivePager(suite.user, nil)
-
-	odDrives, err := api.GetAllDrives(ctx, pager)
-	require.NoError(t, err, clues.ToCore(err))
-	// Test Requirement 1: Need a drive
-	require.Greaterf(t, len(odDrives), 0, "user %s does not have a drive", suite.user)
-
-	// Pick the first drive
-	suite.userDriveID = ptr.Val(odDrives[0].GetId())
+	suite.m365 = its.GetM365(suite.T())
 }

 func getOneDriveItem(
@@ -103,28 +84,36 @@ func (suite *ItemIntegrationSuite) TestItemReader_oneDrive() {
 	defer flush()

 	sc := selectors.
-		NewOneDriveBackup([]string{suite.user}).
+		NewOneDriveBackup([]string{suite.m365.User.ID}).
 		AllData()[0]

-	driveItem := getOneDriveItem(ctx, t, suite.service.ac, suite.userDriveID)
+	driveItem := getOneDriveItem(
+		ctx,
+		t,
+		suite.m365.AC,
+		suite.m365.User.DriveID)

 	// Test Requirement 2: Need a file
 	require.NotEmpty(
 		t,
 		driveItem,
-		"no file item found for user %s drive %s",
-		suite.user,
-		suite.userDriveID)
+		"no file item found for user %q drive %q",
+		suite.m365.User.ID,
+		suite.m365.User.DriveID)

 	bh := &userDriveBackupHandler{
 		baseUserDriveHandler: baseUserDriveHandler{
-			ac: suite.service.ac.Drives(),
+			ac: suite.m365.AC.Drives(),
 		},
-		userID: suite.user,
+		userID: suite.m365.User.ID,
 		scope:  sc,
 	}

 	// Read data for the file
-	itemData, err := downloadItem(ctx, bh, custom.ToCustomDriveItem(driveItem))
+	itemData, err := downloadItem(
+		ctx,
+		bh,
+		suite.m365.User.DriveID,
+		custom.ToCustomDriveItem(driveItem))
 	require.NoError(t, err, clues.ToCore(err))

 	size, err := io.Copy(io.Discard, itemData)
@@ -142,13 +131,13 @@ func (suite *ItemIntegrationSuite) TestIsURLExpired() {
 	ctx, flush := tester.NewContext(t)
 	defer flush()

-	driveItem := getOneDriveItem(ctx, t, suite.service.ac, suite.userDriveID)
+	driveItem := getOneDriveItem(ctx, t, suite.m365.AC, suite.m365.User.DriveID)
 	require.NotEmpty(
 		t,
 		driveItem,
-		"no file item found for user %s drive %s",
-		suite.user,
-		suite.userDriveID)
+		"no file item found for user %q drive %q",
+		suite.m365.User.ID,
+		suite.m365.User.DriveID)

 	var url string
@@ -173,7 +162,7 @@ func (suite *ItemIntegrationSuite) TestItemWriter() {
 	}{
 		{
 			name:    "",
-			driveID: suite.userDriveID,
+			driveID: suite.m365.User.DriveID,
 		},
 		// {
 		// 	name: "sharePoint",
@@ -183,12 +172,12 @@ func (suite *ItemIntegrationSuite) TestItemWriter() {
 	for _, test := range table {
 		suite.Run(test.name, func() {
 			t := suite.T()
-			rh := NewUserDriveRestoreHandler(suite.service.ac)
+			rh := NewUserDriveRestoreHandler(suite.m365.AC)

 			ctx, flush := tester.NewContext(t)
 			defer flush()

-			root, err := suite.service.ac.Drives().GetRootFolder(ctx, test.driveID)
+			root, err := suite.m365.AC.Drives().GetRootFolder(ctx, test.driveID)
 			require.NoError(t, err, clues.ToCore(err))

 			newFolderName := testdata.DefaultRestoreConfig("folder").Location
@@ -217,7 +206,7 @@ func (suite *ItemIntegrationSuite) TestItemWriter() {
 			// HACK: Leveraging this to test getFolder behavior for a file. `getFolder()` on the
 			// newly created item should fail because it's a file not a folder
-			_, err = suite.service.ac.Drives().GetFolderByName(
+			_, err = suite.m365.AC.Drives().GetFolderByName(
 				ctx,
 				test.driveID,
 				ptr.Val(newFolder.GetId()),
@@ -261,7 +250,7 @@ func (suite *ItemIntegrationSuite) TestDriveGetFolder() {
 	}{
 		{
 			name:    "oneDrive",
-			driveID: suite.userDriveID,
+			driveID: suite.m365.User.DriveID,
 		},
 		// {
 		// 	name: "sharePoint",
@@ -275,11 +264,11 @@ func (suite *ItemIntegrationSuite) TestDriveGetFolder() {
 			ctx, flush := tester.NewContext(t)
 			defer flush()

-			root, err := suite.service.ac.Drives().GetRootFolder(ctx, test.driveID)
+			root, err := suite.m365.AC.Drives().GetRootFolder(ctx, test.driveID)
 			require.NoError(t, err, clues.ToCore(err))

 			// Lookup a folder that doesn't exist
-			_, err = suite.service.ac.Drives().GetFolderByName(
+			_, err = suite.m365.AC.Drives().GetFolderByName(
 				ctx,
 				test.driveID,
 				ptr.Val(root.GetId()),
@@ -287,7 +276,7 @@ func (suite *ItemIntegrationSuite) TestDriveGetFolder() {
 			require.ErrorIs(t, err, api.ErrFolderNotFound, clues.ToCore(err))

 			// Lookup a folder that does exist
-			_, err = suite.service.ac.Drives().GetFolderByName(
+			_, err = suite.m365.AC.Drives().GetFolderByName(
 				ctx,
 				test.driveID,
 				ptr.Val(root.GetId()),
@@ -307,6 +296,7 @@ func (m mockGetter) Get(
 	ctx context.Context,
 	url string,
 	headers map[string]string,
+	requireAuth bool,
 ) (*http.Response, error) {
 	return m.GetFunc(ctx, url)
 }
@@ -394,7 +384,7 @@ func (suite *ItemUnitTestSuite) TestDownloadItem() {
 			return nil, clues.New("test error")
 		},
 		errorExpected: require.Error,
-		rcExpected:    require.Nil,
+		rcExpected:    require.NotNil,
 	},
 	{
 		name: "download url is empty",
@@ -431,7 +421,7 @@ func (suite *ItemUnitTestSuite) TestDownloadItem() {
 			}, nil
 		},
 		errorExpected: require.Error,
-		rcExpected:    require.Nil,
+		rcExpected:    require.NotNil,
 	},
 	{
 		name: "non-2xx http response",
@@ -450,7 +440,7 @@ func (suite *ItemUnitTestSuite) TestDownloadItem() {
 			}, nil
 		},
 		errorExpected: require.Error,
-		rcExpected:    require.Nil,
+		rcExpected:    require.NotNil,
 	},
 }
@@ -463,9 +453,78 @@ func (suite *ItemUnitTestSuite) TestDownloadItem() {
 	mg := mockGetter{
 		GetFunc: test.GetFunc,
 	}

-	rc, err := downloadItem(ctx, mg, custom.ToCustomDriveItem(test.itemFunc()))
+	rc, err := downloadItem(
+		ctx,
+		mg,
+		"driveID",
+		custom.ToCustomDriveItem(test.itemFunc()))
 	test.errorExpected(t, err, clues.ToCore(err))
-	test.rcExpected(t, rc)
+	test.rcExpected(t, rc, "reader should only be nil if item is nil")
 		})
 	}
 }

+func (suite *ItemUnitTestSuite) TestDownloadItem_urlByFileSize() {
+	var (
+		testRc = io.NopCloser(bytes.NewReader([]byte("test")))
+		url    = "https://example.com"
+		okResp = &http.Response{
+			StatusCode: http.StatusOK,
+			Body:       testRc,
+		}
+	)
+
+	table := []struct {
+		name          string
+		itemFunc      func() models.DriveItemable
+		GetFunc       func(ctx context.Context, url string) (*http.Response, error)
+		errorExpected require.ErrorAssertionFunc
+		rcExpected    require.ValueAssertionFunc
+		label         string
+	}{
+		{
+			name: "big file",
+			itemFunc: func() models.DriveItemable {
+				di := api.NewDriveItem("test", false)
+				di.SetAdditionalData(map[string]any{"@microsoft.graph.downloadUrl": url})
+				di.SetSize(ptr.To[int64](20 * gigabyte))
+				return di
+			},
+			GetFunc: func(ctx context.Context, url string) (*http.Response, error) {
+				assert.Contains(suite.T(), url, "/content")
+				return okResp, nil
+			},
+		},
+		{
+			name: "small file",
+			itemFunc: func() models.DriveItemable {
+				di := api.NewDriveItem("test", false)
+				di.SetAdditionalData(map[string]any{"@microsoft.graph.downloadUrl": url})
+				di.SetSize(ptr.To[int64](2 * gigabyte))
+				return di
+			},
+			GetFunc: func(ctx context.Context, url string) (*http.Response, error) {
+				assert.NotContains(suite.T(), url, "/content")
+				return okResp, nil
+			},
+		},
+	}
+
+	for _, test := range table {
+		suite.Run(test.name, func() {
+			t := suite.T()
+
+			ctx, flush := tester.NewContext(t)
+			defer flush()
+
+			_, err := downloadItem(
+				ctx,
+				mockGetter{GetFunc: test.GetFunc},
+				"driveID",
+				custom.ToCustomDriveItem(test.itemFunc()))
+			require.NoError(t, err, clues.ToCore(err))
+		})
+	}
+}
@@ -522,7 +581,11 @@ func (suite *ItemUnitTestSuite) TestDownloadItem_ConnectionResetErrorOnFirstRead
 	mg := mockGetter{
 		GetFunc: GetFunc,
 	}

-	rc, err := downloadItem(ctx, mg, custom.ToCustomDriveItem(itemFunc()))
+	rc, err := downloadItem(
+		ctx,
+		mg,
+		"driveID",
+		custom.ToCustomDriveItem(itemFunc()))
 	errorExpected(t, err, clues.ToCore(err))
 	rcExpected(t, rc)

View File

@@ -10,12 +10,13 @@ import (
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/common/syncd"
 	"github.com/alcionai/corso/src/internal/data"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
+	odmetadata "github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	"github.com/alcionai/corso/src/internal/version"
 	"github.com/alcionai/corso/src/pkg/fault"
 	"github.com/alcionai/corso/src/pkg/logger"
 	"github.com/alcionai/corso/src/pkg/path"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/graph"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 )

 // empty string is used to indicate that a permission cannot be restored
@@ -23,20 +24,20 @@ const nonRestorablePermission = ""

 func getParentMetadata(
 	parentPath path.Path,
-	parentDirToMeta syncd.MapTo[metadata.Metadata],
-) (metadata.Metadata, error) {
+	parentDirToMeta syncd.MapTo[odmetadata.Metadata],
+) (odmetadata.Metadata, error) {
 	parentMeta, ok := parentDirToMeta.Load(parentPath.String())
 	if !ok {
 		drivePath, err := path.ToDrivePath(parentPath)
 		if err != nil {
-			return metadata.Metadata{}, clues.Wrap(err, "invalid restore path")
+			return odmetadata.Metadata{}, clues.Wrap(err, "invalid restore path")
 		}

 		if len(drivePath.Folders) != 0 {
-			return metadata.Metadata{}, clues.Wrap(err, "computing item permissions")
+			return odmetadata.Metadata{}, clues.Wrap(err, "computing item permissions")
 		}

-		parentMeta = metadata.Metadata{}
+		parentMeta = odmetadata.Metadata{}
 	}

 	return parentMeta, nil
@@ -49,9 +50,9 @@ func getCollectionMetadata(
 	caches *restoreCaches,
 	backupVersion int,
 	restorePerms bool,
-) (metadata.Metadata, error) {
+) (odmetadata.Metadata, error) {
 	if !restorePerms || backupVersion < version.OneDrive1DataAndMetaFiles {
-		return metadata.Metadata{}, nil
+		return odmetadata.Metadata{}, nil
 	}

 	var (
@@ -61,13 +62,13 @@ func getCollectionMetadata(
 	if len(drivePath.Folders) == 0 {
 		// No permissions for root folder
-		return metadata.Metadata{}, nil
+		return odmetadata.Metadata{}, nil
 	}

 	if backupVersion < version.OneDrive4DirIncludesPermissions {
 		colMeta, err := getParentMetadata(fullPath, caches.ParentDirToMeta)
 		if err != nil {
-			return metadata.Metadata{}, clues.Wrap(err, "collection metadata")
+			return odmetadata.Metadata{}, clues.Wrap(err, "collection metadata")
 		}

 		return colMeta, nil
@@ -82,7 +83,7 @@ func getCollectionMetadata(
 	meta, err := FetchAndReadMetadata(ctx, dc, metaName)
 	if err != nil {
-		return metadata.Metadata{}, clues.Wrap(err, "collection metadata")
+		return odmetadata.Metadata{}, clues.Wrap(err, "collection metadata")
 	}

 	return meta, nil
@@ -93,9 +94,9 @@ func computePreviousLinkShares(
 	ctx context.Context,
 	originDir path.Path,
-	parentMetas syncd.MapTo[metadata.Metadata],
-) ([]metadata.LinkShare, error) {
-	linkShares := []metadata.LinkShare{}
+	parentMetas syncd.MapTo[odmetadata.Metadata],
+) ([]odmetadata.LinkShare, error) {
+	linkShares := []odmetadata.LinkShare{}

 	ctx = clues.Add(ctx, "origin_dir", originDir)

 	parent, err := originDir.Dir()
@@ -122,7 +123,7 @@ func computePreviousLinkShares(
 		// Any change in permissions would change it to custom
 		// permission set and so we can filter on that.
-		if meta.SharingMode == metadata.SharingModeCustom {
+		if meta.SharingMode == odmetadata.SharingModeCustom {
 			linkShares = append(linkShares, meta.LinkShares...)
 		}
@@ -143,11 +144,11 @@ func computePreviousMetadata(
 	ctx context.Context,
 	originDir path.Path,
 	// map parent dir -> parent's metadata
-	parentMetas syncd.MapTo[metadata.Metadata],
-) (metadata.Metadata, error) {
+	parentMetas syncd.MapTo[odmetadata.Metadata],
+) (odmetadata.Metadata, error) {
 	var (
 		parent path.Path
-		meta   metadata.Metadata
+		meta   odmetadata.Metadata

 		err error
 		ok  bool
@@ -158,26 +159,26 @@ func computePreviousMetadata(
 	for {
 		parent, err = parent.Dir()
 		if err != nil {
-			return metadata.Metadata{}, clues.WrapWC(ctx, err, "getting parent")
+			return odmetadata.Metadata{}, clues.WrapWC(ctx, err, "getting parent")
 		}

 		ictx := clues.Add(ctx, "parent_dir", parent)

 		drivePath, err := path.ToDrivePath(parent)
 		if err != nil {
-			return metadata.Metadata{}, clues.WrapWC(ictx, err, "transforming dir to drivePath")
+			return odmetadata.Metadata{}, clues.WrapWC(ictx, err, "transforming dir to drivePath")
 		}

 		if len(drivePath.Folders) == 0 {
-			return metadata.Metadata{}, nil
+			return odmetadata.Metadata{}, nil
 		}

 		meta, ok = parentMetas.Load(parent.String())
 		if !ok {
-			return metadata.Metadata{}, clues.NewWC(ictx, "no metadata found for parent folder: "+parent.String())
+			return odmetadata.Metadata{}, clues.NewWC(ictx, "no metadata found for parent folder: "+parent.String())
 		}

-		if meta.SharingMode == metadata.SharingModeCustom {
+		if meta.SharingMode == odmetadata.SharingModeCustom {
 			return meta, nil
 		}
 	}
@@ -195,7 +196,7 @@ func UpdatePermissions(
 	udip updateDeleteItemPermissioner,
 	driveID string,
 	itemID string,
-	permAdded, permRemoved []metadata.Permission,
+	permAdded, permRemoved []odmetadata.Permission,
 	oldPermIDToNewID syncd.MapTo[string],
 	errs *fault.Bus,
 ) error {
@@ -260,7 +261,7 @@ func UpdatePermissions(
 		// TODO: sitegroup support. Currently errors with "One or more users could not be resolved",
 		// likely due to the site group entityID consisting of a single integer (ex: 4)
-		if len(roles) == 0 || p.EntityType == metadata.GV2SiteGroup {
+		if len(roles) == 0 || p.EntityType == odmetadata.GV2SiteGroup {
 			continue
 		}
@@ -315,7 +316,7 @@ func UpdateLinkShares(
 	upils updateDeleteItemLinkSharer,
 	driveID string,
 	itemID string,
-	lsAdded, lsRemoved []metadata.LinkShare,
+	lsAdded, lsRemoved []odmetadata.LinkShare,
 	oldLinkShareIDToNewID syncd.MapTo[string],
 	errs *fault.Bus,
 ) (bool, error) {
@@ -347,7 +348,7 @@ func UpdateLinkShares(
 		for _, iden := range ls.Entities {
 			// TODO: sitegroup support. Currently errors with "One or more users could not be resolved",
 			// likely due to the site group entityID consisting of a single integer (ex: 4)
-			if iden.EntityType == metadata.GV2SiteGroup {
+			if iden.EntityType == odmetadata.GV2SiteGroup {
 				continue
 			}
@@ -457,11 +458,11 @@ func UpdateLinkShares(
 func filterUnavailableEntitiesInLinkShare(
 	ctx context.Context,
-	linkShares []metadata.LinkShare,
+	linkShares []odmetadata.LinkShare,
 	availableEntities ResourceIDNames,
 	oldLinkShareIDToNewID syncd.MapTo[string],
-) []metadata.LinkShare {
-	filtered := []metadata.LinkShare{}
+) []odmetadata.LinkShare {
+	filtered := []odmetadata.LinkShare{}

 	if availableEntities.Users == nil || availableEntities.Groups == nil {
 		// This should not be happening unless we missed to fill in the caches
@@ -470,20 +471,20 @@ func filterUnavailableEntitiesInLinkShare(
 	}

 	for _, p := range linkShares {
-		entities := []metadata.Entity{}
+		entities := []odmetadata.Entity{}

 		for _, e := range p.Entities {
 			available := false

 			switch e.EntityType {
-			case metadata.GV2User:
+			case odmetadata.GV2User:
 				// Link shares with external users won't have IDs
 				if len(e.ID) == 0 && len(e.Email) > 0 {
 					available = true
 				} else {
 					_, available = availableEntities.Users.NameOf(e.ID)
 				}
-			case metadata.GV2Group:
+			case odmetadata.GV2Group:
 				_, available = availableEntities.Groups.NameOf(e.ID)
 			default:
 				// We only know about users and groups
@@ -513,26 +514,26 @@ func filterUnavailableEntitiesInLinkShare(
 func filterUnavailableEntitiesInPermissions(
 	ctx context.Context,
-	perms []metadata.Permission,
+	perms []odmetadata.Permission,
 	availableEntities ResourceIDNames,
 	oldPermIDToNewID syncd.MapTo[string],
-) []metadata.Permission {
+) []odmetadata.Permission {
 	if availableEntities.Users == nil || availableEntities.Groups == nil {
 		// This should not be happening unless we missed to fill in the caches
 		logger.Ctx(ctx).Info("no available entities, not filtering link shares")
 		return perms
 	}

-	filtered := []metadata.Permission{}
+	filtered := []odmetadata.Permission{}

 	for _, p := range perms {
 		available := false

 		switch p.EntityType {
-		case metadata.GV2User:
+		case odmetadata.GV2User:
 			_, ok := availableEntities.Users.NameOf(p.EntityID)
 			available = available || ok
-		case metadata.GV2Group:
+		case odmetadata.GV2Group:
 			_, ok := availableEntities.Groups.NameOf(p.EntityID)
 			available = available || ok
 		default:
@@ -564,11 +565,11 @@ func RestorePermissions(
 	driveID string,
 	itemID string,
 	itemPath path.Path,
-	current metadata.Metadata,
+	current odmetadata.Metadata,
 	caches *restoreCaches,
 	errs *fault.Bus,
 ) {
-	if current.SharingMode == metadata.SharingModeInherited {
+	if current.SharingMode == odmetadata.SharingModeInherited {
 		return
 	}
@@ -582,7 +583,7 @@ func RestorePermissions(
 	}

 	if previousLinkShares != nil {
-		lsAdded, lsRemoved := metadata.DiffLinkShares(previousLinkShares, current.LinkShares)
+		lsAdded, lsRemoved := odmetadata.DiffLinkShares(previousLinkShares, current.LinkShares)
 		lsAdded = filterUnavailableEntitiesInLinkShare(ctx, lsAdded, caches.AvailableEntities, caches.OldLinkShareIDToNewID)

 		// Link shares have to be updated before permissions as we have to
@@ -608,7 +609,7 @@ func RestorePermissions(
 		return
 	}

-	permAdded, permRemoved := metadata.DiffPermissions(previous.Permissions, current.Permissions)
+	permAdded, permRemoved := odmetadata.DiffPermissions(previous.Permissions, current.Permissions)
 	permAdded = filterUnavailableEntitiesInPermissions(ctx, permAdded, caches.AvailableEntities, caches.OldPermIDToNewID)

 	if didReset {
@@ -617,7 +618,7 @@ func RestorePermissions(
 		// that an item has as they too will be removed.
 		logger.Ctx(ctx).Debug("link share creation reset all inherited permissions")

-		permRemoved = []metadata.Permission{}
+		permRemoved = []odmetadata.Permission{}
 		permAdded = current.Permissions
 	}

View File

@@ -17,7 +17,7 @@ import (
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/data"
 	"github.com/alcionai/corso/src/internal/diagnostics"
-	"github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
+	odmetadata "github.com/alcionai/corso/src/internal/m365/collection/drive/metadata"
 	"github.com/alcionai/corso/src/internal/m365/support"
 	"github.com/alcionai/corso/src/internal/observe"
 	"github.com/alcionai/corso/src/internal/operations/inject"
@@ -31,6 +31,7 @@ import (
 	"github.com/alcionai/corso/src/pkg/path"
 	"github.com/alcionai/corso/src/pkg/services/m365/api"
 	"github.com/alcionai/corso/src/pkg/services/m365/api/graph"
+	"github.com/alcionai/corso/src/pkg/services/m365/api/graph/metadata"
 	"github.com/alcionai/corso/src/pkg/services/m365/custom"
 )
@@ -552,7 +553,7 @@ func CreateRestoreFolders(
 	drivePath *path.DrivePath,
 	restoreDir *path.Builder,
 	folderPath path.Path,
-	folderMetadata metadata.Metadata,
+	folderMetadata odmetadata.Metadata,
 	caches *restoreCaches,
 	restorePerms bool,
 	errs *fault.Bus,
@@ -876,12 +877,12 @@ func FetchAndReadMetadata(
 	ctx context.Context,
 	fibn data.FetchItemByNamer,
 	metaName string,
-) (metadata.Metadata, error) {
+) (odmetadata.Metadata, error) {
 	ctx = clues.Add(ctx, "meta_file_name", metaName)

 	metaFile, err := fibn.FetchItemByName(ctx, metaName)
 	if err != nil {
-		return metadata.Metadata{}, clues.Wrap(err, "getting item metadata")
+		return odmetadata.Metadata{}, clues.Wrap(err, "getting item metadata")
 	}

 	metaReader := metaFile.ToReader()
@@ -889,25 +890,25 @@ func FetchAndReadMetadata(
 	meta, err := getMetadata(metaReader)
 	if err != nil {
-		return metadata.Metadata{}, clues.Wrap(err, "deserializing item metadata")
+		return odmetadata.Metadata{}, clues.Wrap(err, "deserializing item metadata")
 	}

 	return meta, nil
 }

 // getMetadata read and parses the metadata info for an item
-func getMetadata(metar io.ReadCloser) (metadata.Metadata, error) {
-	var meta metadata.Metadata
+func getMetadata(metar io.ReadCloser) (odmetadata.Metadata, error) {
+	var meta odmetadata.Metadata
 	// `metar` will be nil for the top level container folder
 	if metar != nil {
 		metaraw, err := io.ReadAll(metar)
 		if err != nil {
-			return metadata.Metadata{}, err
+			return odmetadata.Metadata{}, err
 		}

 		err = json.Unmarshal(metaraw, &meta)
 		if err != nil {
-			return metadata.Metadata{}, err
+			return odmetadata.Metadata{}, err
 		}
 	}

View File

@@ -93,8 +93,9 @@ func (h siteBackupHandler) Get(
 	ctx context.Context,
 	url string,
 	headers map[string]string,
+	requireAuth bool,
 ) (*http.Response, error) {
-	return h.ac.Get(ctx, url, headers)
+	return h.ac.Get(ctx, url, headers, requireAuth)
 }

 func (h siteBackupHandler) PathPrefix(

View File

@@ -18,6 +18,7 @@ import (
 	"github.com/alcionai/corso/src/internal/common/ptr"
 	"github.com/alcionai/corso/src/internal/tester"
+	"github.com/alcionai/corso/src/internal/tester/its"
 	"github.com/alcionai/corso/src/internal/tester/tconfig"
 	"github.com/alcionai/corso/src/pkg/control"
 	"github.com/alcionai/corso/src/pkg/control/testdata"
@@ -34,9 +35,7 @@ import (
 type URLCacheIntegrationSuite struct {
 	tester.Suite
-	ac      api.Client
-	user    string
-	driveID string
+	m365 its.M365IntgTestSetup
 }

 func TestURLCacheIntegrationSuite(t *testing.T) {
@@ -49,29 +48,12 @@ func TestURLCacheIntegrationSuite(t *testing.T) {
 func (suite *URLCacheIntegrationSuite) SetupSuite() {
 	t := suite.T()
+	suite.m365 = its.GetM365(t)

 	ctx, flush := tester.NewContext(t)
 	defer flush()

 	graph.InitializeConcurrencyLimiter(ctx, true, 4)
-
-	suite.user = tconfig.SecondaryM365UserID(t)
-
-	acct := tconfig.NewM365Account(t)
-	creds, err := acct.M365Config()
-	require.NoError(t, err, clues.ToCore(err))
-
-	suite.ac, err = api.NewClient(
-		creds,
-		control.DefaultOptions(),
-		count.New())
-	require.NoError(t, err, clues.ToCore(err))
-
-	drive, err := suite.ac.Users().GetDefaultDrive(ctx, suite.user)
-	require.NoError(t, err, clues.ToCore(err))
-
-	suite.driveID = ptr.Val(drive.GetId())
 }

 // Basic test for urlCache. Create some files in onedrive, then access them via
@@ -79,22 +61,18 @@ func (suite *URLCacheIntegrationSuite) SetupSuite() {
 func (suite *URLCacheIntegrationSuite) TestURLCacheBasic() {
 	var (
 		t             = suite.T()
-		ac            = suite.ac.Drives()
-		driveID       = suite.driveID
+		ac            = suite.m365.AC.Drives()
+		driveID       = suite.m365.User.DriveID
 		newFolderName = testdata.DefaultRestoreConfig("folder").Location
 	)

 	ctx, flush := tester.NewContext(t)
 	defer flush()

-	// Create a new test folder
-	root, err := ac.GetRootFolder(ctx, driveID)
-	require.NoError(t, err, clues.ToCore(err))
-
 	newFolder, err := ac.PostItemInContainer(
 		ctx,
 		driveID,
-		ptr.Val(root.GetId()),
+		suite.m365.User.DriveRootFolderID,
 		api.NewDriveItem(newFolderName, true),
 		control.Copy)
 	require.NoError(t, err, clues.ToCore(err))
@@ -105,7 +83,7 @@ func (suite *URLCacheIntegrationSuite) TestURLCacheBasic() {
 	// Get the previous delta to feed into url cache
 	pager := ac.EnumerateDriveItemsDelta(
 		ctx,
-		suite.driveID,
+		driveID,
 		"",
 		api.CallConfig{
 			Select: api.URLCacheDriveItemProps(),
@@ -142,10 +120,10 @@ func (suite *URLCacheIntegrationSuite) TestURLCacheBasic() {
 	// Create a new URL cache with a long TTL
 	uc, err := newURLCache(
-		suite.driveID,
+		driveID,
 		du.URL,
 		1*time.Hour,
-		suite.ac.Drives(),
+		ac,
 		count.New(),
 		fault.New(true))
 	require.NoError(t, err, clues.ToCore(err))
@@ -176,7 +154,8 @@ func (suite *URLCacheIntegrationSuite) TestURLCacheBasic() {
 		http.MethodGet,
 		props.downloadURL,
 		nil,
-		nil)
+		nil,
+		false)
 	require.NoError(t, err, clues.ToCore(err))
 	require.NotNil(t, resp)

View File

@@ -93,8 +93,9 @@ func (h userDriveBackupHandler) Get(
 	ctx context.Context,
 	url string,
 	headers map[string]string,
+	requireAuth bool,
 ) (*http.Response, error) {
-	return h.ac.Get(ctx, url, headers)
+	return h.ac.Get(ctx, url, headers, requireAuth)
 }

 func (h userDriveBackupHandler) PathPrefix(
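Both drive backup handlers now forward a requireAuth flag into the client's Get. The interface declaration itself isn't part of this diff, but from the updated call sites (including mockGetter earlier) api.Getter presumably now reads roughly like the following; treat this as an inference from the call sites, not the verbatim source:

```go
// Getter, as inferred from the updated call sites in this diff. The new
// requireAuth parameter forces an authenticated request (set for Graph
// /content downloads), while cached pre-authenticated download URLs
// pass false.
type Getter interface {
	Get(
		ctx context.Context,
		url string,
		headers map[string]string,
		requireAuth bool,
	) (*http.Response, error)
}
```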

View File

@@ -296,6 +296,7 @@ func populateCollections(
 			cl),
 		qp.ProtectedResource.ID(),
 		bh.itemHandler(),
+		bh,
 		addAndRem.Added,
 		addAndRem.Removed,
 		// TODO: produce a feature flag that allows selective

View File

@ -24,6 +24,7 @@ import (
"github.com/alcionai/corso/src/internal/m365/support" "github.com/alcionai/corso/src/internal/m365/support"
"github.com/alcionai/corso/src/internal/operations/inject" "github.com/alcionai/corso/src/internal/operations/inject"
"github.com/alcionai/corso/src/internal/tester" "github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/internal/tester/its"
"github.com/alcionai/corso/src/internal/tester/tconfig" "github.com/alcionai/corso/src/internal/tester/tconfig"
"github.com/alcionai/corso/src/internal/version" "github.com/alcionai/corso/src/internal/version"
"github.com/alcionai/corso/src/pkg/account" "github.com/alcionai/corso/src/pkg/account"
@ -87,6 +88,14 @@ func (bh mockBackupHandler) folderGetter() containerGetter { return
func (bh mockBackupHandler) previewIncludeContainers() []string { return bh.previewIncludes } func (bh mockBackupHandler) previewIncludeContainers() []string { return bh.previewIncludes }
func (bh mockBackupHandler) previewExcludeContainers() []string { return bh.previewExcludes } func (bh mockBackupHandler) previewExcludeContainers() []string { return bh.previewExcludes }
func (bh mockBackupHandler) CanSkipItemFailure(
err error,
resourceID string,
opts control.Options,
) (fault.SkipCause, bool) {
return "", false
}
func (bh mockBackupHandler) NewContainerCache( func (bh mockBackupHandler) NewContainerCache(
userID string, userID string,
) (string, graph.ContainerResolver) { ) (string, graph.ContainerResolver) {
@ -472,10 +481,7 @@ func newStatusUpdater(t *testing.T, wg *sync.WaitGroup) func(status *support.Con
type BackupIntgSuite struct { type BackupIntgSuite struct {
tester.Suite tester.Suite
user string m365 its.M365IntgTestSetup
site string
tenantID string
ac api.Client
} }
func TestBackupIntgSuite(t *testing.T) { func TestBackupIntgSuite(t *testing.T) {
@ -488,35 +494,18 @@ func TestBackupIntgSuite(t *testing.T) {
func (suite *BackupIntgSuite) SetupSuite() { func (suite *BackupIntgSuite) SetupSuite() {
t := suite.T() t := suite.T()
suite.m365 = its.GetM365(t)
ctx, flush := tester.NewContext(t) ctx, flush := tester.NewContext(t)
defer flush() defer flush()
graph.InitializeConcurrencyLimiter(ctx, true, 4) graph.InitializeConcurrencyLimiter(ctx, true, 4)
suite.user = tconfig.M365UserID(t)
suite.site = tconfig.M365SiteID(t)
acct := tconfig.NewM365Account(t)
creds, err := acct.M365Config()
require.NoError(t, err, clues.ToCore(err))
suite.ac, err = api.NewClient(
creds,
control.DefaultOptions(),
count.New())
require.NoError(t, err, clues.ToCore(err))
suite.tenantID = creds.AzureTenantID
tester.LogTimeOfTest(t)
} }
func (suite *BackupIntgSuite) TestMailFetch() { func (suite *BackupIntgSuite) TestMailFetch() {
var ( var (
userID = tconfig.M365UserID(suite.T()) users = []string{suite.m365.User.ID}
users = []string{userID} handlers = BackupHandlers(suite.m365.AC)
handlers = BackupHandlers(suite.ac)
) )
tests := []struct { tests := []struct {
@ -560,14 +549,14 @@ func (suite *BackupIntgSuite) TestMailFetch() {
bpc := inject.BackupProducerConfig{ bpc := inject.BackupProducerConfig{
LastBackupVersion: version.NoBackup, LastBackupVersion: version.NoBackup,
Options: ctrlOpts, Options: ctrlOpts,
ProtectedResource: inMock.NewProvider(userID, userID), ProtectedResource: suite.m365.User.Provider,
} }
collections, err := CreateCollections( collections, err := CreateCollections(
ctx, ctx,
bpc, bpc,
handlers, handlers,
suite.tenantID, suite.m365.TenantID,
test.scope, test.scope,
metadata.DeltaPaths{}, metadata.DeltaPaths{},
func(status *support.ControllerOperationStatus) {}, func(status *support.ControllerOperationStatus) {},
@ -602,9 +591,8 @@ func (suite *BackupIntgSuite) TestMailFetch() {
func (suite *BackupIntgSuite) TestDelta() { func (suite *BackupIntgSuite) TestDelta() {
var ( var (
userID = tconfig.M365UserID(suite.T()) users = []string{suite.m365.User.ID}
users = []string{userID} handlers = BackupHandlers(suite.m365.AC)
handlers = BackupHandlers(suite.ac)
) )
tests := []struct { tests := []struct {
@ -640,7 +628,7 @@ func (suite *BackupIntgSuite) TestDelta() {
bpc := inject.BackupProducerConfig{ bpc := inject.BackupProducerConfig{
LastBackupVersion: version.NoBackup, LastBackupVersion: version.NoBackup,
Options: control.DefaultOptions(), Options: control.DefaultOptions(),
ProtectedResource: inMock.NewProvider(userID, userID), ProtectedResource: suite.m365.User.Provider,
} }
// get collections without providing any delta history (ie: full backup) // get collections without providing any delta history (ie: full backup)
@ -648,7 +636,7 @@ func (suite *BackupIntgSuite) TestDelta() {
ctx, ctx,
bpc, bpc,
handlers, handlers,
-    suite.tenantID,
+    suite.m365.TenantID,
     test.scope,
     metadata.DeltaPaths{},
     func(status *support.ControllerOperationStatus) {},
@@ -681,7 +669,7 @@ func (suite *BackupIntgSuite) TestDelta() {
     ctx,
     bpc,
     handlers,
-    suite.tenantID,
+    suite.m365.TenantID,
     test.scope,
     dps,
     func(status *support.ControllerOperationStatus) {},
@@ -703,8 +691,8 @@ func (suite *BackupIntgSuite) TestMailSerializationRegression() {
     var (
         wg       sync.WaitGroup
-        users    = []string{suite.user}
-        handlers = BackupHandlers(suite.ac)
+        users    = []string{suite.m365.User.ID}
+        handlers = BackupHandlers(suite.m365.AC)
     )
     sel := selectors.NewExchangeBackup(users)
@@ -713,7 +701,7 @@ func (suite *BackupIntgSuite) TestMailSerializationRegression() {
     bpc := inject.BackupProducerConfig{
         LastBackupVersion: version.NoBackup,
         Options:           control.DefaultOptions(),
-        ProtectedResource: inMock.NewProvider(suite.user, suite.user),
+        ProtectedResource: suite.m365.User.Provider,
         Selector:          sel.Selector,
     }
@@ -721,7 +709,7 @@ func (suite *BackupIntgSuite) TestMailSerializationRegression() {
     ctx,
     bpc,
     handlers,
-    suite.tenantID,
+    suite.m365.TenantID,
     sel.Scopes()[0],
     metadata.DeltaPaths{},
     newStatusUpdater(t, &wg),
@@ -773,8 +761,8 @@ func (suite *BackupIntgSuite) TestMailSerializationRegression() {
 // a regression test to ensure that downloaded items can be uploaded.
 func (suite *BackupIntgSuite) TestContactSerializationRegression() {
     var (
-        users    = []string{suite.user}
-        handlers = BackupHandlers(suite.ac)
+        users    = []string{suite.m365.User.ID}
+        handlers = BackupHandlers(suite.m365.AC)
     )
     tests := []struct {
@@ -801,14 +789,14 @@ func (suite *BackupIntgSuite) TestContactSerializationRegression() {
     bpc := inject.BackupProducerConfig{
         LastBackupVersion: version.NoBackup,
         Options:           control.DefaultOptions(),
-        ProtectedResource: inMock.NewProvider(suite.user, suite.user),
+        ProtectedResource: suite.m365.User.Provider,
     }
     edcs, err := CreateCollections(
         ctx,
         bpc,
         handlers,
-        suite.tenantID,
+        suite.m365.TenantID,
         test.scope,
         metadata.DeltaPaths{},
         newStatusUpdater(t, &wg),
@@ -875,8 +863,8 @@ func (suite *BackupIntgSuite) TestContactSerializationRegression() {
 // to be able to successfully query, download and restore event objects
 func (suite *BackupIntgSuite) TestEventsSerializationRegression() {
     var (
-        users    = []string{suite.user}
-        handlers = BackupHandlers(suite.ac)
+        users    = []string{suite.m365.User.ID}
+        handlers = BackupHandlers(suite.m365.AC)
     )
     tests := []struct {
@@ -911,14 +899,14 @@ func (suite *BackupIntgSuite) TestEventsSerializationRegression() {
     bpc := inject.BackupProducerConfig{
         LastBackupVersion: version.NoBackup,
         Options:           control.DefaultOptions(),
-        ProtectedResource: inMock.NewProvider(suite.user, suite.user),
+        ProtectedResource: suite.m365.User.Provider,
     }
     collections, err := CreateCollections(
         ctx,
         bpc,
         handlers,
-        suite.tenantID,
+        suite.m365.TenantID,
         test.scope,
         metadata.DeltaPaths{},
         newStatusUpdater(t, &wg),
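A note on the recurring substitution in these test hunks: the ad-hoc suite fields (suite.tenantID, suite.ac, suite.user, inMock.NewProvider(...)) are being replaced by a shared its.M365IntgTestSetup fixture. Its declaration is not part of this diff; judging only from the accessors used here, it presumably carries at least the following fields, where every type is an assumption rather than a quote from the source:

    // Sketch inferred from usage (suite.m365.TenantID, suite.m365.AC,
    // suite.m365.User.ID, suite.m365.User.Provider); all types are assumed.
    type M365IntgTestSetup struct {
        TenantID string     // replaces suite.tenantID and suite.its.creds.AzureTenantID
        AC       api.Client // replaces suite.ac and suite.its.ac

        User struct {
            ID       string          // replaces suite.user and suite.its.userID
            Provider idname.Provider // assumed type; fills bpc.ProtectedResource
        }
    }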


@@ -19,6 +19,7 @@ import (
     "github.com/alcionai/corso/src/internal/m365/support"
     "github.com/alcionai/corso/src/internal/observe"
     "github.com/alcionai/corso/src/pkg/backup/details"
+    "github.com/alcionai/corso/src/pkg/control"
     "github.com/alcionai/corso/src/pkg/count"
     "github.com/alcionai/corso/src/pkg/errs/core"
     "github.com/alcionai/corso/src/pkg/fault"
@@ -68,21 +69,21 @@ func getItemAndInfo(
     ctx context.Context,
     getter itemGetterSerializer,
     userID string,
-    id string,
+    itemID string,
     useImmutableIDs bool,
     parentPath string,
 ) ([]byte, *details.ExchangeInfo, error) {
     item, info, err := getter.GetItem(
         ctx,
         userID,
-        id,
+        itemID,
         fault.New(true)) // temporary way to force a failFast error
     if err != nil {
         return nil, nil, clues.WrapWC(ctx, err, "fetching item").
             Label(fault.LabelForceNoBackupCreation)
     }
-    itemData, err := getter.Serialize(ctx, item, userID, id)
+    itemData, err := getter.Serialize(ctx, item, userID, itemID)
     if err != nil {
         return nil, nil, clues.WrapWC(ctx, err, "serializing item")
     }
@@ -108,6 +109,7 @@ func NewCollection(
     bc data.BaseCollection,
     user string,
     items itemGetterSerializer,
+    canSkipFailChecker canSkipItemFailurer,
     origAdded map[string]time.Time,
     origRemoved []string,
     validModTimes bool,
@@ -140,6 +142,7 @@ func NewCollection(
         added:         added,
         removed:       removed,
         getter:        items,
+        skipChecker:   canSkipFailChecker,
         statusUpdater: statusUpdater,
     }
 }
@@ -150,6 +153,7 @@ func NewCollection(
         added:         added,
         removed:       removed,
         getter:        items,
+        skipChecker:   canSkipFailChecker,
         statusUpdater: statusUpdater,
         counter:       counter,
     }
@@ -167,7 +171,8 @@ type prefetchCollection struct {
     // removed is a list of item IDs that were deleted from, or moved out, of a container
     removed map[string]struct{}
     getter  itemGetterSerializer
+    skipChecker canSkipItemFailurer
     statusUpdater support.StatusUpdater
 }
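The canSkipItemFailurer type that NewCollection and both collection structs now depend on is not declared in any of the hunks shown. From the CanSkipItemFailure method added to the handlers further down, the interface presumably looks like this sketch (the doc comment is editorial, not from the source):

    // Sketch only: inferred from the contactBackupHandler method later in
    // this diff; the real declaration is outside the shown hunks.
    type canSkipItemFailurer interface {
        // CanSkipItemFailure reports whether a per-item failure for the given
        // resource may be recorded as a skip rather than an error, returning
        // the skip cause when it can.
        CanSkipItemFailure(
            err error,
            resourceID string,
            opts control.Options,
        ) (fault.SkipCause, bool)
    }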
@@ -194,11 +199,12 @@ func (col *prefetchCollection) streamItems(
     wg              sync.WaitGroup
     progressMessage chan<- struct{}
     user            = col.user
+    dataCategory    = col.Category().String()
 )

 ctx = clues.Add(
     ctx,
-    "category", col.Category().String())
+    "category", dataCategory)

 defer func() {
     close(stream)
@@ -227,7 +233,7 @@ func (col *prefetchCollection) streamItems(
     defer close(semaphoreCh)

     // delete all removed items
-    for id := range col.removed {
+    for itemID := range col.removed {
         semaphoreCh <- struct{}{}

         wg.Add(1)
@@ -247,7 +253,7 @@ func (col *prefetchCollection) streamItems(
         if progressMessage != nil {
             progressMessage <- struct{}{}
         }
-    }(id)
+    }(itemID)
 }

 var (
@@ -256,7 +262,7 @@
 )

 // add any new items
-for id := range col.added {
+for itemID := range col.added {
     if el.Failure() != nil {
         break
     }
@@ -277,8 +283,23 @@ func (col *prefetchCollection) streamItems(
     col.Opts().ToggleFeatures.ExchangeImmutableIDs,
     parentPath)
 if err != nil {
+    // pulled outside the switch due to multiple return values.
+    cause, canSkip := col.skipChecker.CanSkipItemFailure(
+        err,
+        user,
+        col.Opts())
+
     // Handle known error cases
     switch {
+    case canSkip:
+        // this is a special case handler that allows the item to be skipped
+        // instead of producing an error.
+        errs.AddSkip(ctx, fault.FileSkip(
+            cause,
+            dataCategory,
+            id,
+            id,
+            nil))
     case errors.Is(err, core.ErrNotFound):
         // Don't report errors for deleted items as there's no way for us to
         // back up data that is gone. Record it as a "success", since there's
@@ -300,6 +321,19 @@ func (col *prefetchCollection) streamItems(
         id,
         map[string]any{"parentPath": parentPath}))
     atomic.AddInt64(&success, 1)
+    case graph.IsErrCorruptData(err):
+        // These items cannot be downloaded, graph error indicates that the item
+        // data is corrupted. Add to skipped list.
+        logger.
+            CtxErr(ctx, err).
+            With("skipped_reason", fault.SkipCorruptData).
+            Info("inaccessible email")
+        errs.AddSkip(ctx, fault.EmailSkip(
+            fault.SkipCorruptData,
+            user,
+            id,
+            map[string]any{"parentPath": parentPath}))
+        atomic.AddInt64(&success, 1)
     default:
         col.Counter.Inc(count.StreamItemsErred)
         el.AddRecoverable(ctx, clues.Wrap(err, "fetching item").Label(fault.LabelForceNoBackupCreation))
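The net effect of this hunk: before falling through to the existing not-found, corrupt-data, and default cases, the collection first asks its handler whether the failure is skippable at all. Condensed, using only calls visible above:

    // Condensed sketch of the decision above.
    cause, canSkip := col.skipChecker.CanSkipItemFailure(err, user, col.Opts())
    if canSkip {
        // recorded as a skip: surfaces via errs.Skipped() and never fails the backup
        errs.AddSkip(ctx, fault.FileSkip(cause, dataCategory, id, id, nil))
    } else {
        // recorded as recoverable: the backup continues, but the item is reported
        el.AddRecoverable(ctx, clues.Wrap(err, "fetching item").
            Label(fault.LabelForceNoBackupCreation))
    }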
@@ -336,7 +370,7 @@ func (col *prefetchCollection) streamItems(
         if progressMessage != nil {
             progressMessage <- struct{}{}
         }
-    }(id)
+    }(itemID)
 }

 wg.Wait()
@@ -364,7 +398,8 @@ type lazyFetchCollection struct {
     // removed is a list of item IDs that were deleted from, or moved out, of a container
     removed map[string]struct{}
     getter  itemGetterSerializer
+    skipChecker canSkipItemFailurer
     statusUpdater support.StatusUpdater
@@ -391,8 +426,8 @@ func (col *lazyFetchCollection) streamItems(
 var (
     success         int64
     progressMessage chan<- struct{}
-    user            = col.user
+    user            = col.user
+    el              = errs.Local()
 )

 defer func() {
@@ -404,7 +439,7 @@ func (col *lazyFetchCollection) streamItems(
         int(success),
         0,
         col.FullPath().Folder(false),
-        errs.Failure())
+        el.Failure())
 }()

 if len(col.added)+len(col.removed) > 0 {
@@ -430,7 +465,7 @@ func (col *lazyFetchCollection) streamItems(
 // add any new items
 for id, modTime := range col.added {
-    if errs.Failure() != nil {
+    if el.Failure() != nil {
         break
     }
@@ -446,15 +481,18 @@ func (col *lazyFetchCollection) streamItems(
     &lazyItemGetter{
         userID:       user,
         itemID:       id,
+        category:     col.Category(),
         getter:       col.getter,
         modTime:      modTime,
         immutableIDs: col.Opts().ToggleFeatures.ExchangeImmutableIDs,
         parentPath:   parentPath,
+        skipChecker:  col.skipChecker,
+        opts:         col.Opts(),
     },
     id,
     modTime,
     col.counter,
-    errs)
+    el)

 atomic.AddInt64(&success, 1)
@@ -468,9 +506,12 @@ type lazyItemGetter struct {
     getter       itemGetterSerializer
     userID       string
     itemID       string
+    category     path.CategoryType
     parentPath   string
     modTime      time.Time
     immutableIDs bool
+    skipChecker  canSkipItemFailurer
+    opts         control.Options
 }

 func (lig *lazyItemGetter) GetData(
@@ -485,6 +526,25 @@ func (lig *lazyItemGetter) GetData(
         lig.immutableIDs,
         lig.parentPath)
     if err != nil {
+        if lig.skipChecker != nil {
+            cause, canSkip := lig.skipChecker.CanSkipItemFailure(
+                err,
+                lig.userID,
+                lig.opts)
+            if canSkip {
+                errs.AddSkip(ctx, fault.FileSkip(
+                    cause,
+                    lig.category.String(),
+                    lig.itemID,
+                    lig.itemID,
+                    nil))
+
+                return nil, nil, false, clues.
+                    NewWC(ctx, "error marked as skippable by handler").
+                    Label(graph.LabelsSkippable)
+            }
+        }
         // If an item was deleted then return an empty file so we don't fail
         // the backup and return a sentinel error when asked for ItemInfo so
         // we don't display the item in the backup.
@@ -499,7 +559,7 @@ func (lig *lazyItemGetter) GetData(
         err = clues.Stack(err)
         errs.AddRecoverable(ctx, err)
-        return nil, nil, false, err
+        return nil, nil, false, clues.Stack(err)
     }
     // Update the mod time to what we already told kopia about. This is required
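Note the asymmetry with the prefetch path: by the time lazyItemGetter.GetData runs, the item has already been emitted into the stream, so it cannot simply be dropped. The getter records the skip and then returns an error labeled graph.LabelsSkippable, which the upload layer is presumably expected to recognize and discard. The unit tests below assert exactly that shape:

    // As exercised in TestLazyFetchCollection_Items_skipFailure further down:
    // reading a skippable lazy item yields a labeled, non-fatal error.
    _, err := io.ReadAll(item.ToReader())
    assert.Error(t, err)
    assert.True(t, clues.HasLabel(err, graph.LabelsSkippable), clues.ToCore(err))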


@@ -28,6 +28,7 @@ import (
     "github.com/alcionai/corso/src/pkg/errs/core"
     "github.com/alcionai/corso/src/pkg/fault"
     "github.com/alcionai/corso/src/pkg/path"
+    "github.com/alcionai/corso/src/pkg/services/m365/api"
     "github.com/alcionai/corso/src/pkg/services/m365/api/graph"
     graphTD "github.com/alcionai/corso/src/pkg/services/m365/api/graph/testdata"
 )
@@ -153,6 +154,7 @@ func (suite *CollectionUnitSuite) TestNewCollection_state() {
         count.New()),
     "u",
     mock.DefaultItemGetSerialize(),
+    mock.NeverCanSkipFailChecker(),
     nil,
     nil,
     colType.validModTimes,
@@ -298,6 +300,7 @@ func (suite *CollectionUnitSuite) TestPrefetchCollection_Items() {
         count.New()),
     "",
     &mock.ItemGetSerialize{},
+    mock.NeverCanSkipFailChecker(),
     test.added,
     maps.Keys(test.removed),
     false,
@@ -333,6 +336,232 @@ func (suite *CollectionUnitSuite) TestPrefetchCollection_Items() {
     }
 }
func (suite *CollectionUnitSuite) TestPrefetchCollection_Items_skipFailure() {
var (
start = time.Now().Add(-time.Second)
statusUpdater = func(*support.ControllerOperationStatus) {}
)
table := []struct {
name string
category path.CategoryType
handler backupHandler
added map[string]time.Time
removed map[string]struct{}
expectItemCount int
expectSkippedCount int
expectErr assert.ErrorAssertionFunc
}{
{
name: "no items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
expectErr: assert.NoError,
},
{
name: "events only added items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
added: map[string]time.Time{
"fisher": {},
"flannigan": {},
"fitzbog": {},
},
expectItemCount: 0,
expectSkippedCount: 3,
expectErr: assert.NoError,
},
{
name: "events only removed items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
removed: map[string]struct{}{
"princess": {},
"poppy": {},
"petunia": {},
},
expectItemCount: 3,
expectSkippedCount: 0,
expectErr: assert.NoError,
},
{
name: "events added and removed items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
added: map[string]time.Time{
"general": {},
},
removed: map[string]struct{}{
"general": {},
"goose": {},
"grumbles": {},
},
expectItemCount: 3,
// not 1, because general is removed from the added
// map due to being in the removed map
expectSkippedCount: 0,
expectErr: assert.NoError,
},
{
name: "contacts only added items",
category: path.ContactsCategory,
handler: newContactBackupHandler(api.Client{}),
added: map[string]time.Time{
"fisher": {},
"flannigan": {},
"fitzbog": {},
},
expectItemCount: 0,
expectSkippedCount: 0,
expectErr: assert.Error,
},
{
name: "contacts only removed items",
category: path.ContactsCategory,
handler: newContactBackupHandler(api.Client{}),
removed: map[string]struct{}{
"princess": {},
"poppy": {},
"petunia": {},
},
expectItemCount: 3,
expectSkippedCount: 0,
expectErr: assert.NoError,
},
{
name: "contacts added and removed items",
category: path.ContactsCategory,
handler: newContactBackupHandler(api.Client{}),
added: map[string]time.Time{
"general": {},
},
removed: map[string]struct{}{
"general": {},
"goose": {},
"grumbles": {},
},
expectItemCount: 3,
// not 1, because general is removed from the added
// map due to being in the removed map
expectSkippedCount: 0,
expectErr: assert.NoError,
},
{
name: "mail only added items",
category: path.EmailCategory,
handler: newMailBackupHandler(api.Client{}),
added: map[string]time.Time{
"fisher": {},
"flannigan": {},
"fitzbog": {},
},
expectItemCount: 0,
expectSkippedCount: 0,
expectErr: assert.Error,
},
{
name: "mail only removed items",
category: path.EmailCategory,
handler: newMailBackupHandler(api.Client{}),
removed: map[string]struct{}{
"princess": {},
"poppy": {},
"petunia": {},
},
expectItemCount: 3,
expectSkippedCount: 0,
expectErr: assert.NoError,
},
{
name: "mail added and removed items",
category: path.EmailCategory,
handler: newMailBackupHandler(api.Client{}),
added: map[string]time.Time{
"general": {},
},
removed: map[string]struct{}{
"general": {},
"goose": {},
"grumbles": {},
},
expectItemCount: 3,
// not 1, because general is removed from the added
// map due to being in the removed map
expectSkippedCount: 0,
expectErr: assert.NoError,
},
}
for _, test := range table {
suite.Run(test.name, func() {
var (
t = suite.T()
errs = fault.New(true)
itemCount int
)
ctx, flush := tester.NewContext(t)
defer flush()
fullPath, err := path.Build("t", "pr", path.ExchangeService, test.category, false, "fnords", "smarf")
require.NoError(t, err, clues.ToCore(err))
locPath, err := path.Build("t", "pr", path.ExchangeService, test.category, false, "fnords", "smarf")
require.NoError(t, err, clues.ToCore(err))
opts := control.DefaultOptions()
opts.SkipEventsOnInstance503ForResources = map[string]struct{}{}
opts.SkipEventsOnInstance503ForResources["pr"] = struct{}{}
col := NewCollection(
data.NewBaseCollection(
fullPath,
nil,
locPath.ToBuilder(),
opts,
false,
count.New()),
"pr",
&mock.ItemGetSerialize{
SerializeErr: graph.ErrServiceUnavailableEmptyResp,
},
test.handler,
test.added,
maps.Keys(test.removed),
false,
statusUpdater,
count.New())
for item := range col.Items(ctx, errs) {
itemCount++
_, rok := test.removed[item.ID()]
if rok {
dimt, ok := item.(data.ItemModTime)
require.True(t, ok, "item implements data.ItemModTime")
assert.True(t, dimt.ModTime().After(start), "deleted items should set mod time to now()")
assert.True(t, item.Deleted(), "removals should be marked as deleted")
}
_, aok := test.added[item.ID()]
if !rok && aok {
assert.False(t, item.Deleted(), "additions should not be marked as deleted")
}
assert.True(t, aok || rok, "item must be either added or removed: %q", item.ID())
}
test.expectErr(t, errs.Failure())
assert.Equal(
t,
test.expectItemCount,
itemCount,
"should see all expected items")
assert.Len(t, errs.Skipped(), test.expectSkippedCount)
})
}
}
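The pattern in the table above is the point of the test: every case injects the same graph.ErrServiceUnavailableEmptyResp serialization failure for resource "pr", yet only the events cases land in the skipped list; contacts and mail surface a failure instead. That matches the option's name, and only the event handler appears to honor it:

    // How the tests opt a resource in to skipping; the same 503 that becomes
    // a skip for events still fails a contacts or mail backup.
    opts := control.DefaultOptions()
    opts.SkipEventsOnInstance503ForResources = map[string]struct{}{
        "pr": {}, // protected-resource ID allowed to skip events on a 503
    }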
 // This test verifies skipped error cases are handled correctly by collection enumeration
 func (suite *CollectionUnitSuite) TestCollection_SkippedErrors() {
     var (
@@ -364,6 +593,17 @@ func (suite *CollectionUnitSuite) TestCollection_SkippedErrors() {
         },
         expectedSkipError: fault.EmailSkip(fault.SkipInvalidRecipients, "", "fisher", nil),
     },
{
name: "ErrorCorruptData",
added: map[string]time.Time{
"fisher": {},
},
expectItemCount: 0,
itemGetter: &mock.ItemGetSerialize{
GetErr: graphTD.ODataErr(string(graph.ErrorCorruptData)),
},
expectedSkipError: fault.EmailSkip(fault.SkipCorruptData, "", "fisher", nil),
},
     }

     for _, test := range table {
@@ -387,6 +627,7 @@ func (suite *CollectionUnitSuite) TestCollection_SkippedErrors() {
             count.New()),
         "",
         test.itemGetter,
+        mock.NeverCanSkipFailChecker(),
         test.added,
         nil,
         false,
@@ -467,6 +708,7 @@ func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_LazyFetch() {
     expectItemCount: 3,
     expectReads: []string{
         "fisher",
+        "flannigan",
         "fitzbog",
     },
 },
@@ -519,6 +761,7 @@ func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_LazyFetch() {
         count.New()),
     "",
     mlg,
+    mock.NeverCanSkipFailChecker(),
     test.added,
     maps.Keys(test.removed),
     true,
@@ -530,10 +773,10 @@ func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_LazyFetch() {
 _, rok := test.removed[item.ID()]
 if rok {
-    assert.True(t, item.Deleted(), "removals should be marked as deleted")
     dimt, ok := item.(data.ItemModTime)
     require.True(t, ok, "item implements data.ItemModTime")
     assert.True(t, dimt.ModTime().After(start), "deleted items should set mod time to now()")
+    assert.True(t, item.Deleted(), "removals should be marked as deleted")
 }

 modTime, aok := test.added[item.ID()]
@@ -542,7 +785,6 @@ func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_LazyFetch() {
     // initializer.
     assert.Implements(t, (*data.ItemModTime)(nil), item)
     assert.Equal(t, modTime, item.(data.ItemModTime).ModTime(), "item mod time")
-
     assert.False(t, item.Deleted(), "additions should not be marked as deleted")

     // Check if the test wants us to read the item's data so the lazy
@@ -562,6 +804,8 @@ func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_LazyFetch() {
         // collection initializer.
         assert.NoError(t, err, clues.ToCore(err))
         assert.Equal(t, modTime, info.Modified(), "ItemInfo mod time")
+    } else {
+        assert.Fail(t, "unexpected read on item %s", item.ID())
     }
 }
@@ -578,6 +822,294 @@ func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_LazyFetch() {
     }
 }
func (suite *CollectionUnitSuite) TestLazyFetchCollection_Items_skipFailure() {
var (
start = time.Now().Add(-time.Second)
statusUpdater = func(*support.ControllerOperationStatus) {}
expectSkip = func(t *testing.T, err error) {
assert.Error(t, err, clues.ToCore(err))
assert.ErrorContains(t, err, "skip")
assert.True(t, clues.HasLabel(err, graph.LabelsSkippable), clues.ToCore(err))
}
expectNotSkipped = func(t *testing.T, err error) {
assert.Error(t, err, clues.ToCore(err))
assert.NotContains(t, err.Error(), "skip")
}
)
table := []struct {
name string
added map[string]time.Time
removed map[string]struct{}
category path.CategoryType
handler backupHandler
expectItemCount int
expectSkippedCount int
expectReads []string
expectErr func(t *testing.T, err error)
expectFailure assert.ErrorAssertionFunc
}{
{
name: "no items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
expectFailure: assert.NoError,
},
{
name: "events only added items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
added: map[string]time.Time{
"fisher": start.Add(time.Minute),
"flannigan": start.Add(2 * time.Minute),
"fitzbog": start.Add(3 * time.Minute),
},
expectItemCount: 3,
expectSkippedCount: 3,
expectReads: []string{
"fisher",
"flannigan",
"fitzbog",
},
expectErr: expectSkip,
expectFailure: assert.NoError,
},
{
name: "events only removed items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
removed: map[string]struct{}{
"princess": {},
"poppy": {},
"petunia": {},
},
expectItemCount: 3,
expectSkippedCount: 0,
expectErr: expectSkip,
expectFailure: assert.NoError,
},
{
name: "events added and removed items",
category: path.EventsCategory,
handler: newEventBackupHandler(api.Client{}),
added: map[string]time.Time{
"general": {},
},
removed: map[string]struct{}{
"general": {},
"goose": {},
"grumbles": {},
},
expectItemCount: 3,
// not 1, because general is removed from the added
// map due to being in the removed map
expectSkippedCount: 0,
expectErr: expectSkip,
expectFailure: assert.NoError,
},
{
name: "contacts only added items",
category: path.ContactsCategory,
handler: newContactBackupHandler(api.Client{}),
added: map[string]time.Time{
"fisher": start.Add(time.Minute),
"flannigan": start.Add(2 * time.Minute),
"fitzbog": start.Add(3 * time.Minute),
},
expectItemCount: 3,
expectSkippedCount: 0,
expectReads: []string{
"fisher",
"flannigan",
"fitzbog",
},
expectErr: expectNotSkipped,
expectFailure: assert.Error,
},
{
name: "contacts only removed items",
category: path.ContactsCategory,
handler: newContactBackupHandler(api.Client{}),
removed: map[string]struct{}{
"princess": {},
"poppy": {},
"petunia": {},
},
expectItemCount: 3,
expectSkippedCount: 0,
expectErr: expectNotSkipped,
expectFailure: assert.NoError,
},
{
name: "contacts added and removed items",
category: path.ContactsCategory,
handler: newContactBackupHandler(api.Client{}),
added: map[string]time.Time{
"general": {},
},
removed: map[string]struct{}{
"general": {},
"goose": {},
"grumbles": {},
},
expectItemCount: 3,
// not 1, because general is removed from the added
// map due to being in the removed map
expectSkippedCount: 0,
expectErr: expectNotSkipped,
expectFailure: assert.NoError,
},
{
name: "mail only added items",
category: path.EmailCategory,
handler: newMailBackupHandler(api.Client{}),
added: map[string]time.Time{
"fisher": start.Add(time.Minute),
"flannigan": start.Add(2 * time.Minute),
"fitzbog": start.Add(3 * time.Minute),
},
expectItemCount: 3,
expectSkippedCount: 0,
expectReads: []string{
"fisher",
"flannigan",
"fitzbog",
},
expectErr: expectNotSkipped,
expectFailure: assert.Error,
},
{
name: "mail only removed items",
category: path.EmailCategory,
handler: newMailBackupHandler(api.Client{}),
removed: map[string]struct{}{
"princess": {},
"poppy": {},
"petunia": {},
},
expectItemCount: 3,
expectSkippedCount: 0,
expectErr: expectNotSkipped,
expectFailure: assert.NoError,
},
{
name: "mail added and removed items",
category: path.EmailCategory,
handler: newMailBackupHandler(api.Client{}),
added: map[string]time.Time{
"general": {},
},
removed: map[string]struct{}{
"general": {},
"goose": {},
"grumbles": {},
},
expectItemCount: 3,
// not 1, because general is removed from the added
// map due to being in the removed map
expectSkippedCount: 0,
expectErr: expectNotSkipped,
expectFailure: assert.NoError,
},
}
for _, test := range table {
suite.Run(test.name, func() {
var (
t = suite.T()
errs = fault.New(false)
itemCount int
)
ctx, flush := tester.NewContext(t)
defer flush()
fullPath, err := path.Build("t", "pr", path.ExchangeService, test.category, false, "fnords", "smarf")
require.NoError(t, err, clues.ToCore(err))
locPath, err := path.Build("t", "pr", path.ExchangeService, test.category, false, "fnords", "smarf")
require.NoError(t, err, clues.ToCore(err))
mlg := &mockLazyItemGetterSerializer{
ItemGetSerialize: &mock.ItemGetSerialize{
SerializeErr: graph.ErrServiceUnavailableEmptyResp,
},
}
defer mlg.check(t, test.expectReads)
opts := control.DefaultOptions()
opts.SkipEventsOnInstance503ForResources = map[string]struct{}{}
opts.SkipEventsOnInstance503ForResources["pr"] = struct{}{}
col := NewCollection(
data.NewBaseCollection(
fullPath,
nil,
locPath.ToBuilder(),
opts,
false,
count.New()),
"pr",
mlg,
test.handler,
test.added,
maps.Keys(test.removed),
true,
statusUpdater,
count.New())
for item := range col.Items(ctx, errs) {
itemCount++
_, rok := test.removed[item.ID()]
if rok {
dimt, ok := item.(data.ItemModTime)
require.True(t, ok, "item implements data.ItemModTime")
assert.True(t, dimt.ModTime().After(start), "deleted items should set mod time to now()")
assert.True(t, item.Deleted(), "removals should be marked as deleted")
}
modTime, aok := test.added[item.ID()]
if !rok && aok {
// Item's mod time should be what's passed into the collection
// initializer.
assert.Implements(t, (*data.ItemModTime)(nil), item)
assert.Equal(t, modTime, item.(data.ItemModTime).ModTime(), "item mod time")
assert.False(t, item.Deleted(), "additions should not be marked as deleted")
// Check if the test wants us to read the item's data so the lazy
// data fetch is executed.
if slices.Contains(test.expectReads, item.ID()) {
r := item.ToReader()
_, err := io.ReadAll(r)
test.expectErr(t, err)
r.Close()
} else {
assert.Fail(t, "unexpected read on item %s", item.ID())
}
}
assert.True(t, aok || rok, "item must be either added or removed: %q", item.ID())
}
failure := errs.Failure()
if failure == nil && len(errs.Recovered()) > 0 {
failure = errs.Recovered()[0]
}
test.expectFailure(t, failure, clues.ToCore(failure))
assert.Equal(
t,
test.expectItemCount,
itemCount,
"should see all expected items")
assert.Len(t, errs.Skipped(), test.expectSkippedCount)
})
}
}
 func (suite *CollectionUnitSuite) TestLazyItem_NoRead_GetInfo_Errors() {
     t := suite.T()


@@ -1,6 +1,8 @@
 package exchange

 import (
+    "github.com/alcionai/corso/src/pkg/control"
+    "github.com/alcionai/corso/src/pkg/fault"
     "github.com/alcionai/corso/src/pkg/services/m365/api"
     "github.com/alcionai/corso/src/pkg/services/m365/api/graph"
 )
@@ -52,3 +54,11 @@ func (h contactBackupHandler) NewContainerCache(
         getter: h.ac,
     }
 }
func (h contactBackupHandler) CanSkipItemFailure(
err error,
resourceID string,
opts control.Options,
) (fault.SkipCause, bool) {
return "", false
}
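The contact handler above declines unconditionally, and the mail cases in the collection tests behave the same way. The event handler's implementation is not shown in this diff; from the test expectations (a 503-with-empty-response becomes a skip only for events, and only for resources listed in SkipEventsOnInstance503ForResources), it plausibly looks like the following hedged reconstruction. The skip-cause value in particular is invented for illustration:

    // Hedged reconstruction, not from the source. The cause value is a
    // placeholder; the 503 sentinel mirrors the error the unit tests inject.
    func (h eventBackupHandler) CanSkipItemFailure(
        err error,
        resourceID string,
        opts control.Options,
    ) (fault.SkipCause, bool) {
        if err == nil {
            return "", false
        }

        // only resources explicitly opted in may skip.
        if _, ok := opts.SkipEventsOnInstance503ForResources[resourceID]; !ok {
            return "", false
        }

        // only the known instance-503 failure qualifies.
        if !errors.Is(err, graph.ErrServiceUnavailableEmptyResp) {
            return "", false
        }

        return "known-event-instance-503", true // placeholder cause value
    }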


@@ -0,0 +1,83 @@
package exchange
import (
"testing"
"github.com/google/uuid"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/suite"
"github.com/alcionai/corso/src/internal/tester"
"github.com/alcionai/corso/src/pkg/control"
"github.com/alcionai/corso/src/pkg/fault"
"github.com/alcionai/corso/src/pkg/services/m365/api"
)
type ContactsBackupHandlerUnitSuite struct {
tester.Suite
}
func TestContactsBackupHandlerUnitSuite(t *testing.T) {
suite.Run(t, &ContactsBackupHandlerUnitSuite{Suite: tester.NewUnitSuite(t)})
}
func (suite *ContactsBackupHandlerUnitSuite) TestHandler_CanSkipItemFailure() {
resourceID := uuid.NewString()
table := []struct {
name string
err error
opts control.Options
expect assert.BoolAssertionFunc
expectCause fault.SkipCause
}{
{
name: "no config",
err: assert.AnError,
opts: control.Options{},
expect: assert.False,
},
{
name: "false when map is empty",
err: assert.AnError,
opts: control.Options{
SkipEventsOnInstance503ForResources: map[string]struct{}{},
},
expect: assert.False,
},
{
name: "false on nil error",
err: nil,
opts: control.Options{
SkipEventsOnInstance503ForResources: map[string]struct{}{
resourceID: {},
},
},
expect: assert.False,
},
{
name: "false even if resource matches",
err: assert.AnError,
opts: control.Options{
SkipEventsOnInstance503ForResources: map[string]struct{}{
resourceID: {},
},
},
expect: assert.False,
},
}
for _, test := range table {
suite.Run(test.name, func() {
t := suite.T()
h := newContactBackupHandler(api.Client{})
cause, result := h.CanSkipItemFailure(
test.err,
resourceID,
test.opts)
test.expect(t, result)
assert.Equal(t, test.expectCause, cause)
})
}
}


@@ -126,7 +126,7 @@ func (cfc *contactContainerCache) Populate(
     if err != nil {
         errs.AddRecoverable(
             ctx,
-            graph.Stack(ctx, err).Label(fault.LabelForceNoBackupCreation))
+            clues.StackWC(ctx, err).Label(fault.LabelForceNoBackupCreation))
     }
 }


@@ -120,7 +120,7 @@ func restoreContact(
 ) (*details.ExchangeInfo, error) {
     contact, err := api.BytesToContactable(body)
     if err != nil {
-        return nil, graph.Wrap(ctx, err, "creating contact from bytes")
+        return nil, clues.WrapWC(ctx, err, "creating contact from bytes")
     }

     ctx = clues.Add(ctx, "item_id", ptr.Val(contact.GetId()))
@@ -148,7 +148,7 @@ func restoreContact(
     item, err := cr.PostItem(ctx, userID, destinationID, contact)
     if err != nil {
-        return nil, graph.Wrap(ctx, err, "restoring contact")
+        return nil, clues.Wrap(err, "restoring contact")
     }

     // contacts have no PUT request, and PATCH could retain data that's not
@@ -159,7 +159,7 @@ func restoreContact(
     if shouldDeleteOriginal {
         err := cr.DeleteItem(ctx, userID, collisionID)
         if err != nil && !errors.Is(err, core.ErrNotFound) {
-            return nil, graph.Wrap(ctx, err, "deleting colliding contact")
+            return nil, clues.Wrap(err, "deleting colliding contact")
         }
     }


@@ -12,6 +12,7 @@ import (
     "github.com/alcionai/corso/src/internal/m365/service/exchange/mock"
     "github.com/alcionai/corso/src/internal/tester"
+    "github.com/alcionai/corso/src/internal/tester/its"
     "github.com/alcionai/corso/src/internal/tester/tconfig"
     "github.com/alcionai/corso/src/pkg/control"
     "github.com/alcionai/corso/src/pkg/control/testdata"
@@ -54,7 +55,7 @@ func (m *contactRestoreMock) DeleteItem(
 type ContactsRestoreIntgSuite struct {
     tester.Suite
-    its intgTesterSetup
+    m365 its.M365IntgTestSetup
 }

 func TestContactsRestoreIntgSuite(t *testing.T) {
@@ -66,17 +67,17 @@ func TestContactsRestoreIntgSuite(t *testing.T) {
 }

 func (suite *ContactsRestoreIntgSuite) SetupSuite() {
-    suite.its = newIntegrationTesterSetup(suite.T())
+    suite.m365 = its.GetM365(suite.T())
 }

 // Testing to ensure that the cache system works in multiple different environments
 func (suite *ContactsRestoreIntgSuite) TestCreateContainerDestination() {
     runCreateDestinationTest(
         suite.T(),
-        newContactRestoreHandler(suite.its.ac),
+        newContactRestoreHandler(suite.m365.AC),
         path.ContactsCategory,
-        suite.its.creds.AzureTenantID,
-        suite.its.userID,
+        suite.m365.TenantID,
+        suite.m365.User.ID,
         testdata.DefaultRestoreConfig("").Location,
         []string{"Hufflepuff"},
         []string{"Ravenclaw"})
@@ -207,17 +208,16 @@ func (suite *ContactsRestoreIntgSuite) TestRestoreContact() {
     for _, test := range table {
         suite.Run(test.name, func() {
             t := suite.T()
-            ctr := count.New()

             ctx, flush := tester.NewContext(t)
             defer flush()
+            ctr := count.New()

             _, err := restoreContact(
                 ctx,
                 test.apiMock,
                 body,
-                suite.its.userID,
+                suite.m365.User.ID,
                 "destination",
                 test.collisionMap,
                 test.onCollision,

Some files were not shown because too many files have changed in this diff.