Merge pull request #771 from ing-bank/feat/providence

feat(providence): add providence package
This commit is contained in:
Joren Broekema 2020-06-25 13:49:27 +02:00 committed by GitHub
commit 8300e15547
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
102 changed files with 13411 additions and 658 deletions

View file

@ -3,6 +3,7 @@ module.exports = {
'*': ['eclint fix', 'git add'],
'*.js': ['eslint --fix', 'prettier --write', 'git add'],
'*.md': ['prettier --write', 'markdownlint', 'git add'],
'yarn.lock': ['node ./scripts/yarn-lock-scan.js'],
'*package.json': absolutePaths => {
const sortPackages = [];
absolutePaths.forEach(p => {

View file

@ -1,6 +1,6 @@
module.exports = {
hooks: {
'pre-commit': 'lint-staged && node ./scripts/yarn-lock-scan.js',
'pre-commit': 'lint-staged',
'commit-msg': 'commitlint -E HUSKY_GIT_PARAMS',
},
};

View file

@ -0,0 +1,3 @@
providence-output
providence-input-data
/.nyc_output

View file

@ -0,0 +1,210 @@
[//]: # 'AUTO INSERT HEADER PREPUBLISH'
# Providence
```js script
import { html } from 'lit-html';
import { providenceFlowSvg, providenceInternalFlowSvg } from './docs/_mermaid.svg.js';
export default {
title: 'Providence/Main',
};
```
Providence is the 'All Seeing Eye' that generates usage statistics by analyzing code.
It measures the effectiveness and popularity of your software.
With just a few commands you can measure the impact of (breaking) changes, making
your release process more stable and predictable.
Providence can be used as a dev dependency in a project for which metrics
can be generated via analyzers (see below).
For instance, for a repo "lion-based-ui" that extends @lion/\* we can answer questions like:
- **Which subsets of my product are popular?**
Which exports of reference project @lion/form-core are consumed by target project "lion-based-ui"?
- **How do subclassers consume/override my product?**
Which classes / webcomponents inside target project "lion-based-ui" extend from reference project @lion/\*?
Which of the methods within those classes are overridden?
- etc...
All the above results can be shown in a dashboard (see below), which allows you to sort exports from the reference
project (@lion) by popularity, category, consumer, etc.
The dashboard can also aggregate data from many target projects and will show you on a
detailed (file) level how those components are being consumed and by which projects.
## Setup
### Install providence
```sh
npm i --save-dev providence-analytics
```
### Add a providence script to package.json
```js
...
"scripts": {
...
"providence": "providence analyze match-imports -r 'node_modules/@lion/*'",
}
```
> The example above illustrates how to run the "match-imports" analyzer with `@lion/*` as reference projects. Note that it is possible to run other analyzers and configurations supported by providence as well. For a full overview of CLI options, run `providence --help`. All supported analyzers are listed when running `providence analyze`.
You are now ready to use providence in your project. All
data will be stored in JSON files in the folder `./providence-output`.
```js story
export const runProvidence = () => html`
<img src=${new URL('./dev-assets/provicli.gif', import.meta.url)} alt="CLI" />
`;
```
## Setup: dashboard
### Add "providence:dashboard" script to package.json
```js
...
"scripts": {
...
"providence:dashboard": "node node_modules/providence/dashboard/src/server.js"
}
```
### Add providence.conf.js
```js
const providenceConfig = {
referenceCollections: {
'lion-based-ui collection': ['./node_modules/lion-based-ui'],
},
};
module.exports = providenceConfig;
```
Run `npm run providence:dashboard`
```js story
export const dashboard = () => html`
<img src=${new URL('./dev-assets/providash.gif', import.meta.url)} alt="dashboard" />
`;
```
## Setup: about result output
All output files will be stored in `./providence-output`.
This means they will be committed to git, so your colleagues don't have to
rerun the analysis (for large projects with many dependencies this can be time consuming)
and can directly start the dashboard usage metrics.
Also, note that the files serve as cache (they are stored with hashes based on project version and analyzer configuration). This means that an interrupted analysis can be
resumed later on.
## Conceptual overview
Providence performs queries on one or more search targets.
These search targets consist of one or more software projects (javascript/html repositories).
The diagram below shows how the `providenceMain` function can be used from an external context.
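Programmatic usage looks roughly like the sketch below. It is modelled on how the "extend-docs" script in this package calls providence; the import paths and the chosen analyzer/reference paths are illustrative assumptions (in this package, `providenceMain` is exported as `providence`).
```js
// Minimal sketch; import paths, analyzer choice and reference paths are assumptions.
const { providence } = require('providence-analytics');
const {
  QueryService,
} = require('providence-analytics/src/program/services/QueryService.js');

async function run() {
  // Create a queryConfig for the 'match-imports' analyzer
  const queryConfig = QueryService.getQueryConfigFromAnalyzer('match-imports', {});
  // Run providence against the current project, using a reference project
  await providence(queryConfig, {
    queryMethod: 'ast',
    targetProjectPaths: [process.cwd()],
    referenceProjectPaths: ['./node_modules/lion-based-ui'],
  });
}

run();
```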
```js story
export const providenceFlow = () => providenceFlowSvg;
```
## Flow inside providence
The diagram below depicts the flow inside the `providenceMain` function.
It uses:
- InputDataService
Used to create a data structure based on a folder (for instance the search target or
the references root). The structure creates entries for every file, which get enriched with code,
AST results, query results, etc. Returns an `InputData` object.
- QueryService
Requires a `queryConfig` and an `InputData` object. It performs a query (grep search or AST analysis)
and returns a `QueryResult`.
It also contains helpers for creating a `queryConfig`.
- ReportService
The result is output to the user. Currently, a log to the console and/or a dump to a JSON file
are available as output formats.
```js story
export const providenceInternalFlow = () => providenceInternalFlowSvg;
```
## Queries
Providence requires queries as input.
Queries are defined as objects and can be of two types:
- feature-query
- analyzer
A `queryConfig` is required as input to run the `providenceMain` function.
This object specifies the type of query and contains the relevant meta
information that will later be output in the `QueryResult` (the JSON object that
the `providenceMain` function returns).
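The sketch below shows how a `queryConfig` can be created for both query types, assuming the `QueryService` helpers that the CLI in this package uses; the exact import path is an assumption.
```js
// Sketch only; import path is an assumption.
const {
  QueryService,
} = require('providence-analytics/src/program/services/QueryService.js');

// feature-query: a feature string like "tg-icon[size=xs]" (typically run with queryMethod 'grep')
const featureQueryConfig = QueryService.getQueryConfigFromFeatureString('tg-icon[size=xs]');

// analyzer query: a predefined AST-based analysis (run with queryMethod 'ast');
// an optional configuration object can be passed as a second argument
const analyzerQueryConfig = QueryService.getQueryConfigFromAnalyzer('find-imports', {});
```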
## Analyzer Query
Analyzer queries are also created via `queryConfig`s.
Analyzers can be described as predefined queries that use AST traversal.
Run:
```sh
providence analyze
```
Now you will get a list of all predefined analyzers:
- find-imports
- find-exports
- match-imports
- find-subclasses
- etc...
```js story
export const analyzerQuery = () => html`
<img src=${new URL('./dev-assets/analyzer-query.gif', import.meta.url)} alt="Analyzer query" />
`;
```
## Running providence from its own repo
### How to add a new search target project
```sh
git submodule add <git-url> ./providence-input-data/search-targets/<project-name>
```
### How to add a reference project
By adding a reference project, you can automatically see how code in your reference project is
used across the search target projects.
Under the hood, this automatically creates a set of queries for you.
```sh
git submodule add <git-url> ./providence-input-data/references/<project-name>
```
### Updating submodules
Please run:
```sh
git submodule update --init --recursive
```
### Removing submodules
Please run:
```sh
sh ./rm-submodule.sh <path/to/submodule>
```

View file

@ -0,0 +1,214 @@
// eslint-disable-next-line max-classes-per-file
import { LitElement, html, css } from 'lit-element';
import { DecorateMixin } from '../../utils/DecorateMixin.js';
export class PTable extends DecorateMixin(LitElement) {
static get properties() {
return {
mobile: {
reflect: true,
type: Boolean,
},
data: Object,
// Sorted, sliced data, based on user interaction
_viewData: Object,
};
}
static get styles() {
return [
super.styles,
css`
/**
* Structural css
*/
[role='row'] {
display: flex;
}
[role='cell'],
[role='columnheader'] {
flex: 1;
}
[role='columnheader'] {
font-weight: bold;
}
.c-table__cell__header {
display: none;
}
.c-table__head {
background-color: var(--header-bg-color);
color: var(--header-color);
}
.c-table[mobile] .c-table__head {
display: none;
}
.c-table[mobile] .c-table__row {
flex-direction: column;
}
.c-table[mobile] .c-table__cell {
display: flex;
}
.c-table[mobile] .c-table__cell__header,
.c-table[mobile] .c-table__cell__text {
flex: 1;
}
.c-table[mobile] .c-table__cell__header {
display: block;
background-color: var(--header-bg-color);
color: var(--header-color);
}
`,
];
}
// eslint-disable-next-line class-methods-use-this
_descTemplate() {
return html` <span aria-label="descending">&#x25BC;</span> `;
}
// eslint-disable-next-line class-methods-use-this
_ascTemplate() {
return html` <span aria-label="ascending">&#x25B2;</span> `;
}
_mainTemplate(headers, sortMap, data, m) {
if (!(headers && sortMap && data)) {
return html``;
}
return html`
<div role="table" class="c-table" ?mobile=${m}>
<div role="rowgroup" class="c-table__head">
<div role="row" class="c-table__row c-table__columnheader-wrapper">
${headers.map(
header => html`
<div role="columnheader" class="c-table__columnheader">
<button @click="${() => this._sortBy(header)}" class="c-table__sort-button">
${header}
<span class="c-table__sort-indicator">
${sortMap[header] === 'desc' ? this._descTemplate() : this._ascTemplate()}
</span>
</button>
</div>
`,
)}
</div>
</div>
<div role="rowgroup" class="c-table__body">
${data.map(
row => html`
<div role="${m ? 'presentation' : 'row'}" class="c-table__row">
${headers.map(
header => html`
<div role="${m ? 'row' : 'cell'}" class="c-table__cell">
<span
id="item1"
role="${m ? 'rowheader' : 'presentation'}"
class="c-table__cell__header"
>
${header}
</span>
<span role="${m ? 'cell' : 'presentation'}" class="c-table__cell__text">
${this.renderCellContent(row[header], header)}
</span>
</div>
`,
)}
</div>
`,
)}
</div>
</div>
`;
}
render() {
return this._mainTemplate(
this._viewDataHeaders,
this.__viewDataSortMap,
this._viewData,
this.mobile,
);
}
constructor() {
super();
this.__viewDataSortMap = {};
}
connectedCallback() {
if (super.connectedCallback) {
super.connectedCallback();
}
const mql = window.matchMedia('(max-width: 767px)');
this.mobile = mql.matches;
mql.addListener(({ matches }) => {
this.mobile = matches;
});
}
updated(changedProperties) {
super.updated(changedProperties);
if (changedProperties.has('data')) {
this.__computeViewData(this.data);
}
}
/**
* @overridable
* @param {string} content
* @param {string} header
*/
// eslint-disable-next-line class-methods-use-this, no-unused-vars
renderCellContent(content, header) {
return content;
}
__computeViewData(newData) {
this._viewData = [...newData];
this._viewDataHeaders = Object.keys(newData[0]);
}
_sortBy(specifier) {
this.__setSortMapValue(specifier);
const comparison = (a, b) => {
if (this.__viewDataSortMap[specifier] === 'desc') {
return b[specifier] > a[specifier];
}
return b[specifier] < a[specifier];
};
this._viewData.sort((a, b) => {
if (comparison(a, b)) {
return 1;
}
if (b[specifier] === a[specifier]) {
return 0;
}
return -1;
});
this.__computeViewData(this._viewData);
}
__setSortMapValue(specifier) {
// initialize to desc first time
if (!this.__viewDataSortMap[specifier]) {
this.__viewDataSortMap[specifier] = 'desc';
} else {
const cur = this.__viewDataSortMap[specifier];
// Toggle asc / desc
this.__viewDataSortMap[specifier] = cur === 'desc' ? 'asc' : 'desc';
}
}
}

View file

@ -0,0 +1,434 @@
/* eslint-disable max-classes-per-file */
import { LitElement, html, css } from 'lit-element';
import { tooltip as tooltipStyles } from './styles/tooltip.css.js';
import { global as globalStyles } from './styles/global.css.js';
import { utils as utilsStyles } from './styles/utils.css.js';
import { tableDecoration } from './styles/tableDecoration.css.js';
import { GlobalDecorator } from './utils/GlobalDecorator.js';
import { DecorateMixin } from './utils/DecorateMixin.js';
import { downloadFile } from './utils/downloadFile.js';
import { PTable } from './components/p-table/PTable.js';
// Decorate third party component styles
GlobalDecorator.decorateStyles(globalStyles, { prepend: true });
PTable.decorateStyles(tableDecoration);
customElements.define('p-table', PTable);
function checkedValues(checkboxOrNodeList) {
if (!checkboxOrNodeList.length) {
return checkboxOrNodeList.checked && checkboxOrNodeList.value;
}
return Array.from(checkboxOrNodeList)
.filter(r => r.checked)
.map(r => r.value);
}
class PBoard extends DecorateMixin(LitElement) {
static get properties() {
return {
// Transformed data from fetch
tableData: Object,
__resultFiles: Array,
__menuData: Object,
};
}
static get styles() {
return [
super.styles,
utilsStyles,
tooltipStyles,
css`
p-table {
border: 1px solid gray;
display: block;
margin: 2px;
}
.heading {
font-size: 1.5em;
letter-spacing: 0.1em;
}
.heading__part {
color: var(--primary-color);
}
.menu-group {
display: flex;
flex-wrap: wrap;
flex-direction: column;
}
`,
];
}
/**
* @param {object} result menu data derived from providence.conf.js and search-target-deps-file.json
* @param {object} result.referenceCollections reference projects, grouped by collection, as defined in providence.conf.js
* @param {object} result.searchTargetDeps deps retrieved by running providence, read from search-target-deps-file.json
*/
_selectionMenuTemplate(result) {
if (!result) {
return html``;
}
const { referenceCollections, searchTargetDeps } = result;
return html`
<test-table></test-table>
<form class="u-c-mv2" id="selection-menu-form" action="" @change="${this._aggregateResults}">
<fieldset>
<legend>References (grouped by collection)</legend>
${Object.keys(referenceCollections).map(
colName => html`
<div role="separator">${colName}</div>
${referenceCollections[colName].map(
refName => html`
<label
><input
type="checkbox"
name="references"
.checked=${colName === 'lion-based-ui'}
value="${refName}"
/>${refName}</label
>
`,
)}
`,
)}
</fieldset>
<fieldset>
<legend>Repositories (grouped by search target)</legend>
${Object.keys(searchTargetDeps).map(
rootProjName => html`
<details>
<summary>
<span class="u-bold">${rootProjName}</span>
<input
aria-label="check all"
type="checkbox"
checked
@change="${({ target }) => {
// TODO: of course, logic depending on dom is never a good idea
const groupBoxes = target.parentElement.nextElementSibling.querySelectorAll(
'input[type=checkbox]',
);
const { checked } = target;
Array.from(groupBoxes).forEach(box => {
// eslint-disable-next-line no-param-reassign
box.checked = checked;
});
}}"
/>
</summary>
<div class="menu-group">
${searchTargetDeps[rootProjName].map(
dep => html`
<label
><input
type="checkbox"
name="repos"
.checked="${dep}"
value="${dep}"
/>${dep}</label
>
`,
)}
</div>
<hr />
</details>
`,
)}
</fieldset>
</form>
`;
}
_activeAnalyzerSelectTemplate() {
return html`
<select id="active-analyzer">
${Object.keys(this.__resultFiles).map(
analyzerName => html` <option value="${analyzerName}">${analyzerName}</option> `,
)}
</select>
`;
}
get _selectionMenuFormNode() {
return this.shadowRoot.getElementById('selection-menu-form');
}
get _activeAnalyzerNode() {
return this.shadowRoot.getElementById('active-analyzer');
}
get _tableNode() {
return this.shadowRoot.querySelector('p-table');
}
_createCsv(headers = this._tableNode._viewDataHeaders, data = this._tableNode._viewData) {
let result = 'sep=;\n';
result += `${headers.join(';')}\n`;
data.forEach(row => {
result += `${Object.values(row)
.map(v => {
if (Array.isArray(v)) {
const res = [];
v.forEach(vv => {
// TODO: make recursive
if (typeof vv === 'string') {
res.push(vv);
} else {
// typeof v === 'object'
res.push(JSON.stringify(vv));
}
});
return res.join(', ');
}
if (typeof v === 'object') {
// This has knowledge about specifier.
// TODO: make more generic and add toString() to this obj in generation phase
return v.name;
}
return v;
})
.join(';')}\n`;
});
return result;
}
render() {
return html`
<div style="display:flex; align-items: baseline;">
<h1 class="heading">providence <span class="heading__part">dashboard</span> (alpha)</h1>
<div class="u-ml2">
${this._activeAnalyzerSelectTemplate()}
<button @click="${() => downloadFile('data.csv', this._createCsv())}">
get csv
</button>
</div>
</div>
${this._selectionMenuTemplate(this.__menuData)}
<p-table .data="${this.tableData}" class="u-mt3"></p-table>
`;
}
constructor() {
super();
this.__resultFiles = [];
this.__menuData = null;
}
firstUpdated(...args) {
super.firstUpdated(...args);
this._tableNode.renderCellContent = this._renderCellContent.bind(this);
this.__init();
}
async __init() {
await this.__fetchMenuData();
await this.__fetchResults();
// await this.__fetchProvidenceConf();
this._enrichMenuData();
}
updated(changedProperties) {
super.updated(changedProperties);
if (changedProperties.has('__menuData')) {
this._aggregateResults();
}
}
/**
* Gets all selection menu data and creates an aggregated
* '_viewData' result.
*/
async _aggregateResults() {
if (!this.__menuData) {
return;
}
await this.__fetchResults();
const elements = Array.from(this._selectionMenuFormNode.elements);
const repos = elements.filter(n => n.name === 'repos');
const references = elements.filter(n => n.name === 'references');
const activeRefs = [...new Set(checkedValues(references))];
const activeRepos = [...new Set(checkedValues(repos))];
const activeAnalyzer = this._activeAnalyzerNode.value;
const totalQueryOutput = this.__aggregateResultData(activeRefs, activeRepos, activeAnalyzer);
// function addCategories(specifierRes, metaConfig) {
// const resultCats = [];
// if (metaConfig.categoryConfig) {
// const { project, filePath, name } = specifierRes.exportSpecifier;
// // First of all, do we have a matching project?
// // TODO: we should allow different configs for different (major) versions
// const match = metaConfig.categoryConfig.find(cat => cat.project === project);
// console.log('match', match);
// if (match) {
// Object.entries(match.categories, ([categoryName, matchFn]) => {
// if (matchFn(filePath, name)) {
// resultCats.push(categoryName);
// }
// });
// }
// }
// console.log('resultCats', resultCats, metaConfig);
// return resultCats;
// }
// Prepare viewData
const dataResult = [];
// When we support more analyzers than match-imports and match-subclasses, make a switch
// here
totalQueryOutput.forEach((specifierRes, i) => {
dataResult[i] = {};
dataResult[i].specifier = specifierRes.exportSpecifier;
dataResult[i].sourceProject = specifierRes.exportSpecifier.project;
// dataResult[i].categories = undefined; // addCategories(specifierRes, this.__providenceConf);
dataResult[i].type = specifierRes.exportSpecifier.name === '[file]' ? 'file' : 'specifier';
dataResult[i].count = specifierRes.matchesPerProject
.map(mpp => mpp.files)
.flat(Infinity).length;
dataResult[i].matchedProjects = specifierRes.matchesPerProject;
});
this.tableData = dataResult;
}
__aggregateResultData(activeRefs, activeRepos, activeAnalyzer) {
const jsonResultsActiveFilter = [];
activeRefs.forEach(ref => {
const refSearch = `_${ref.replace('#', '_')}_`;
activeRepos.forEach(dep => {
const depSearch = `_${dep.replace('#', '_')}_`;
const found = this.__resultFiles[activeAnalyzer].find(
({ fileName }) => fileName.includes(refSearch) && fileName.includes(depSearch),
);
if (found) {
jsonResultsActiveFilter.push(found.content);
} else {
// eslint-disable-next-line no-console
console.warn(`No result output json for ${refSearch} and ${depSearch}`);
}
});
});
let totalQueryOutput = [];
jsonResultsActiveFilter.forEach(json => {
if (!Array.isArray(json.queryOutput)) {
// can be a string like [no-mactched-dependency]
return;
}
// Start by adding the first entry of totalQueryOutput
if (!totalQueryOutput) {
totalQueryOutput = json.queryOutput;
return;
}
json.queryOutput.forEach(currentRec => {
// Json queryOutput
// Now, look if we already have an "exportSpecifier".
const totalRecFound = totalQueryOutput.find(
totalRec => currentRec.exportSpecifier.id === totalRec.exportSpecifier.id,
);
// If so, concatenate the "matchesPerProject" array to the existing one
if (totalRecFound) {
// TODO: merge smth?
totalRecFound.matchesPerProject = totalRecFound.matchesPerProject.concat(
currentRec.matchesPerProject,
);
}
// If not, just add a new one to the array.
else {
totalQueryOutput.push(currentRec);
}
});
});
return totalQueryOutput;
}
_enrichMenuData() {
const menuData = this.__initialMenuData;
// Object.keys(menuData.searchTargetDeps).forEach((groupName) => {
// menuData.searchTargetDeps[groupName] = menuData.searchTargetDeps[groupName].map(project => (
// { project, checked: true } // check whether we have results, also for active references
// ));
// });
this.__menuData = menuData;
}
/**
* @override
* @param {*} content
*/
// eslint-disable-next-line class-methods-use-this
_renderSpecifier(content) {
let display;
if (content.name === '[file]') {
display = content.filePath;
} else {
display = content.name;
}
const tooltip = content.filePath;
return html`
<div>
<span class="c-tooltip c-tooltip--right" data-tooltip="${tooltip}"> ${display} </span>
</div>
`;
}
/**
* @override
* @param {*} content
* @param {*} header
*/
// eslint-disable-next-line class-methods-use-this
_renderCellContent(content, header) {
if (header === 'specifier') {
return this._renderSpecifier(content);
}
if (header === 'matchedProjects') {
return html`${content
.sort((a, b) => b.files.length - a.files.length)
.map(
mpp => html`
<details>
<summary>
<span style="font-weight:bold;">${mpp.project}</span>
(${mpp.files.length})
</summary>
<ul>
${mpp.files.map(
f => html`<li>${typeof f === 'object' ? JSON.stringify(f) : f}</li>`,
)}
</ul>
</details>
`,
)}`;
}
if (content instanceof Array) {
return content.join(', ');
}
return content;
}
async __fetchMenuData() {
// Derived from providence.conf.js
this.__initialMenuData = await fetch('/menu-data').then(response => response.json());
}
async __fetchProvidenceConf() {
// Gets the metaConfig (categories) from providence.conf.js, as served by the dashboard server
this.__providenceConf = await fetch('/providence.conf.js').then(response => response.json());
}
async __fetchResults() {
this.__resultFiles = await fetch('/results').then(response => response.json());
}
}
customElements.define('p-board', PBoard);

View file

@ -0,0 +1,16 @@
import { css } from 'lit-element';
export const global = css`
:host {
font-family: 'Roboto Condensed', sans-serif;
--primary-color: cornflowerblue;
}
* {
box-sizing: border-box;
}
*:focus {
outline: 2px dotted gray;
}
`;

View file

@ -0,0 +1,45 @@
import { css } from 'lit-element';
// Decoration of white label component 'c-table', which is consumed by webcomponent 'p-table'
export const tableDecoration = css`
:host {
--sort-indicator-color: var(--primary-color);
--header-bg-color: #333;
--header-color: #fff;
}
.c-table__row {
transition: 1s all;
}
.c-table__row:nth-child(2n) {
background: #f7f7f7;
}
.c-table__sort-button {
border: none;
background: none;
padding: 16px;
font-size: 16px;
color: var(--sort-color);
}
.c-table__sort-indicator {
font-size: 12px;
color: var(--sort-indicator-color);
}
.c-table__cell {
padding: 16px;
}
.c-table[mobile] .c-table__cell {
padding: 0;
}
.c-table[mobile] .c-table__cell__header,
.c-table[mobile] .c-table__cell__text {
padding: 16px;
}
`;

View file

@ -0,0 +1,90 @@
import { css } from 'lit-element';
export const tooltip = css`
.c-tooltip {
position: relative;
cursor: pointer;
padding: 8px 0;
}
.c-tooltip::after {
background-color: #eee;
border-radius: 10px;
color: black;
display: none;
padding: 10px 15px;
position: absolute;
text-align: center;
z-index: 999;
}
.c-tooltip::before {
background-color: #333;
content: ' ';
display: none;
position: absolute;
width: 15px;
height: 15px;
z-index: 999;
}
.c-tooltip:hover::after {
display: block;
}
.c-tooltip:hover::before {
display: block;
}
.c-tooltip.c-tooltip--top::after {
content: attr(data-tooltip);
top: 0;
left: 50%;
transform: translate(-50%, calc(-100% - 10px));
}
.c-tooltip.c-tooltip--top::before {
top: 0;
left: 50%;
transform: translate(-50%, calc(-100% - 5px)) rotate(45deg);
}
.c-tooltip.c-tooltip--bottom::after {
content: attr(data-tooltip);
bottom: 0;
left: 50%;
transform: translate(-50%, calc(100% + 10px));
}
.c-tooltip.c-tooltip--bottom::before {
bottom: 0;
left: 50%;
transform: translate(-50%, calc(100% + 5px)) rotate(45deg);
}
.c-tooltip.c-tooltip--right::after {
content: attr(data-tooltip);
top: 0;
right: 0;
transform: translateX(calc(100% + 10px));
}
.c-tooltip.c-tooltip--right::before {
top: 50%;
right: 0;
transform: translate(calc(100% + 5px), -50%) rotate(45deg);
}
.c-tooltip.c-tooltip--left::after {
content: attr(data-tooltip);
top: 0;
left: 0;
transform: translateX(calc(-100% - 10px));
}
.c-tooltip.c-tooltip--left::before {
top: 50%;
left: 0;
transform: translate(calc(-100% - 5px), -50%) rotate(45deg);
}
`;

View file

@ -0,0 +1,29 @@
import { css } from 'lit-element';
export const utils = css`
.u-bold {
font-weight: bold;
}
.u-mb1 {
margin-bottom: 8px;
}
.u-mt3 {
margin-top: 24px;
}
.u-ml2 {
margin-left: 16px;
}
.u-mv2 {
margin-top: 16px;
margin-bottom: 16px;
}
.u-c-mv2 > * {
margin-top: 16px;
margin-bottom: 16px;
}
`;

View file

@ -0,0 +1,90 @@
import { css } from 'lit-element';
export const tooltipComponentStyles = css`
.c-tooltip {
position: relative;
cursor: pointer;
padding: 8px 0;
}
.c-tooltip::after {
background-color: #eee;
border-radius: 10px;
color: black;
display: none;
padding: 10px 15px;
position: absolute;
text-align: center;
z-index: 999;
}
.c-tooltip::before {
background-color: #333;
content: ' ';
display: none;
position: absolute;
width: 15px;
height: 15px;
z-index: 999;
}
.c-tooltip:hover::after {
display: block;
}
.c-tooltip:hover::before {
display: block;
}
.c-tooltip.c-tooltip--top::after {
content: attr(data-tooltip);
top: 0;
left: 50%;
transform: translate(-50%, calc(-100% - 10px));
}
.c-tooltip.c-tooltip--top::before {
top: 0;
left: 50%;
transform: translate(-50%, calc(-100% - 5px)) rotate(45deg);
}
.c-tooltip.c-tooltip--bottom::after {
content: attr(data-tooltip);
bottom: 0;
left: 50%;
transform: translate(-50%, calc(100% + 10px));
}
.c-tooltip.c-tooltip--bottom::before {
bottom: 0;
left: 50%;
transform: translate(-50%, calc(100% + 5px)) rotate(45deg);
}
.c-tooltip.c-tooltip--right::after {
content: attr(data-tooltip);
top: 0;
right: 0;
transform: translateX(calc(100% + 10px));
}
.c-tooltip.c-tooltip--right::before {
top: 50%;
right: 0;
transform: translate(calc(100% + 5px), -50%) rotate(45deg);
}
.c-tooltip.c-tooltip--left::after {
content: attr(data-tooltip);
top: 0;
left: 0;
transform: translateX(calc(-100% - 10px));
}
.c-tooltip.c-tooltip--left::before {
top: 50%;
left: 0;
transform: translate(calc(-100% - 5px), -50%) rotate(45deg);
}
`;

View file

@ -0,0 +1,40 @@
import { GlobalDecorator } from './GlobalDecorator.js';
// TODO: dedupe via @lion
export const DecorateMixin = superclass => {
// eslint-disable-next-line no-shadow
class DecorateMixin extends superclass {
/**
*
* @param {CssResult[]} styles
* @param {boolean} prepend
*/
static decorateStyles(styles, { prepend } = {}) {
if (!prepend) {
this.__decoratedStyles.push(styles);
} else {
this.__decoratedStylesPrepended.push(styles);
}
}
static decorateMethod(name, fn) {
const originalMethod = this.prototype[name];
this.prototype[name] = (...args) => {
fn(originalMethod, ...args);
};
}
static get styles() {
return [
...GlobalDecorator.globalDecoratedStylesPrepended,
...this.__decoratedStylesPrepended,
...(super.styles || []),
...GlobalDecorator.globalDecoratedStyles,
...this.__decoratedStyles,
];
}
}
DecorateMixin.__decoratedStyles = [];
DecorateMixin.__decoratedStylesPrepended = [];
return DecorateMixin;
};

View file

@ -0,0 +1,15 @@
export class GlobalDecorator {
/**
* @param { CssResult[] } styles
* @param { boolean } prepend
*/
static decorateStyles(styles, { prepend } = {}) {
if (!prepend) {
this.globalDecoratedStyles.push(styles);
} else {
this.globalDecoratedStylesPrepended.push(styles);
}
}
}
GlobalDecorator.globalDecoratedStylesPrepended = [];
GlobalDecorator.globalDecoratedStyles = [];

View file

@ -0,0 +1,14 @@
/**
* @desc Can be called from a button click handler in order to let the end user download a file
* @param {string} filename like 'overview.csv'
* @param {string} content for instance a csv file
*/
export function downloadFile(filename, content) {
const element = document.createElement('a');
element.setAttribute('href', `data:text/plain;charset=utf-8,${encodeURIComponent(content)}`);
element.setAttribute('download', filename);
element.style.display = 'none';
document.body.appendChild(element);
element.click();
document.body.removeChild(element);
}

View file

@ -0,0 +1,18 @@
<!DOCTYPE html>
<html>
<head>
<title>providence-board</title>
<style>
body {
margin: 8px 32px;
}
</style>
<script type="module" src="./app/p-board.js"></script>
</head>
<body>
<p-board></p-board>
</body>
</html>

View file

@ -0,0 +1,114 @@
const fs = require('fs');
const pathLib = require('path');
const { createConfig, startServer } = require('es-dev-server');
const { ReportService } = require('../../src/program/services/ReportService.js');
const { LogService } = require('../../src/program/services/LogService.js');
// eslint-disable-next-line import/no-dynamic-require
const providenceConf = require(`${pathLib.join(process.cwd(), 'providence.conf.js')}`);
let outputFilePaths;
try {
outputFilePaths = fs.readdirSync(ReportService.outputPath);
} catch (_) {
LogService.error(
`Please make sure providence results can be found in ${ReportService.outputPath}`,
);
process.exit(1);
}
const resultFiles = {};
let searchTargetDeps;
const supportedAnalyzers = ['match-imports', 'match-subclasses'];
outputFilePaths.forEach(fileName => {
const content = JSON.parse(
fs.readFileSync(pathLib.join(ReportService.outputPath, fileName), 'utf-8'),
);
if (fileName === 'search-target-deps-file.json') {
searchTargetDeps = content;
} else {
const analyzerName = fileName.split('_-_')[0];
if (!supportedAnalyzers.includes(analyzerName)) {
return;
}
if (!resultFiles[analyzerName]) {
resultFiles[analyzerName] = [];
}
resultFiles[analyzerName].push({ fileName, content });
}
});
function transformToProjectNames(collections) {
const res = {};
// eslint-disable-next-line array-callback-return
Object.entries(collections).map(([key, val]) => {
res[key] = val.map(c => pathLib.basename(c));
});
return res;
}
const pathFromServerRootToHere = `/${pathLib.relative(process.cwd(), __dirname)}`;
const config = createConfig({
port: 8080,
// appIndex: './dashboard/index.html',
// rootDir: process.cwd(),
nodeResolve: true,
// moduleDirs: pathLib.resolve(process.cwd(), 'node_modules'),
watch: false,
open: true,
middlewares: [
// eslint-disable-next-line consistent-return
async (ctx, next) => {
// TODO: Quick and dirty solution: refactor in a nicer way
if (ctx.url.startsWith('/app')) {
ctx.url = `${pathFromServerRootToHere}/${ctx.url}`;
return next();
}
if (ctx.url === '/') {
ctx.url = `${pathFromServerRootToHere}/index.html`;
return next();
}
if (ctx.url === '/results') {
ctx.body = resultFiles;
} else if (ctx.url === '/menu-data') {
// Gathers all data that are relevant to create a configuration menu
// at the top of the dashboard:
// - referenceCollections as defined in providence.conf.js
// - searchTargetCollections (aka programs) as defined in providence.conf.js
// - searchTargetDeps as found in search-target-deps-file.json
// Also do some processing on the presentation of a project, so that it can be easily
// outputted in frontend
let searchTargetCollections;
if (providenceConf.searchTargetCollections) {
searchTargetCollections = transformToProjectNames(providenceConf.searchTargetCollections);
} else {
searchTargetCollections = Object.keys(searchTargetDeps).map(d => d.split('#')[0]);
}
const menuData = {
// N.B. theoretically there can be a mismatch between basename and pkgJson.name,
// but we assume folder names and pkgJson.names to be similar
searchTargetCollections,
referenceCollections: transformToProjectNames(providenceConf.referenceCollections),
searchTargetDeps,
};
ctx.body = menuData;
} else if (ctx.url === '/providence.conf.js') {
// We need to fetch it via server, since it's CommonJS vs es modules...
// require("@babel/core").transform("code", {
// plugins: ["@babel/plugin-transform-modules-commonjs"]
// });
// Gives back categories from providence.conf
ctx.body = providenceConf.metaConfig;
} else {
await next();
}
},
],
});
(async () => {
await startServer(config);
})();

Binary file not shown.

After

Width:  |  Height:  |  Size: 7.4 MiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 6.4 MiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 918 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 4 MiB

View file

@ -0,0 +1,91 @@
[//]: # 'AUTO INSERT HEADER PREPUBLISH'
# Analyzer
```js script
export default {
title: 'Providence/Analyzer',
};
```
Analyzers form the core of Providence. They contain predefined queries based on AST traversal/analysis.
A few examples are:
- find-imports
- find-exports
- match-imports
An analyzer will give back a [QueryResult](./QueryResult.md) that will be written to the
file system by Providence.
All analyzers need to extend from the `Analyzer` base class, found in `src/program/analyzers/helpers`.
## Public api
An analyzer has the following configuration api:
- name (string)
- requiresReference (boolean)
An analyzer will always need a targetProjectPath and can optionally have a referenceProjectPath.
In the latter case, it needs to have `requiresReference: true` configured.
During AST traversal, the following api can be consulted (a minimal sketch follows the list below):
- `.targetData`
- `.referenceData`
- `.identifier`
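A minimal, hypothetical analyzer could look like the sketch below. The import path, constructor shape and execution hooks of the `Analyzer` base class are assumptions derived from the description above, not the actual base-class contract.
```js
// Hypothetical sketch; the base-class location and hooks are assumptions.
const { Analyzer } = require('../analyzers/helpers/Analyzer.js');

class FindMyFeatureAnalyzer extends Analyzer {
  constructor() {
    super();
    this.name = 'find-my-feature'; // hypothetical analyzer name
    this.requiresReference = false; // set to true when a referenceProjectPath is needed
  }
  // During AST traversal, `this.targetData`, `this.referenceData` and
  // `this.identifier` would be available for consultation.
}

module.exports = { FindMyFeatureAnalyzer };
```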
## Phases
### Prepare phase
In this phase, all preparations will be done to run the analysis.
Providence is designed to be performant and therefore first checks whether an
already existing, cached result is available for the current setup.
### Traverse phase
The ASTs are created for all projects involved and the data are extracted into a QueryOutput.
This output can optionally be post processed.
### Finalize phase
The data are normalized and written to the filesystem in JSON format.
## Targets and references
Every Analyzer needs a targetProjectPath. A targetProjectPath is a file path string that points to the root of the project to be analyzed.
## Types
We can roughly distinguish two types of analyzers: those that require a reference and those that
don't require a reference.
## Database
In order to share data across multiple machines, results are written to the filesystem in a
"machine agnostic" way.
They can be shared through git and serve as a local database.
### Caching
In order to make caching possible, Providence creates an "identifier": a hash from the combination of project versions + Analyzer configuration. When an identifier already exists in the filesystem,
the result can be read from cache.
This increases performance and helps mitigate memory problems that can occur when handling large
amounts of data in a batch.
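As an illustration only (the exact hashing is an implementation detail of providence), such an identifier roughly combines the target project's name and version with a hash of the analyzer configuration, as can be seen in the example `identifier` in [QueryResult](./QueryResult.md):
```js
// Illustrative sketch, not the actual implementation.
const crypto = require('crypto');

function createIdentifier(targetProject, analyzerConfig) {
  // Hash the analyzer configuration so a changed config results in a new cache entry
  const configHash = crypto
    .createHash('md5')
    .update(JSON.stringify(analyzerConfig))
    .digest('hex')
    .slice(0, 10);
  // e.g. 'importing-target-project_0.0.2-target-mock__<hash>'
  return `${targetProject.name}_${targetProject.version}__${configHash}`;
}

module.exports = { createIdentifier };
```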
## Analyzer helpers
Inside the folder './src/program/analyzers', a folder 'helpers' is found.
Helpers are created specifically for use within analyzers and have knowledge about
the context of the analyzer (knowledge about an AST and/or QueryResult structure).
Generic functionality (that can be applied in any context) can be found in './src/program/utils'.
## Post processors
Post processors are imported by analyzers and act on their outputs. They can be enabled via
the configuration of an analyzer. They can be found in './src/program/analyzers/post-processors'.
For instance, a post processor can transform the output of the 'find-imports' analyzer by sorting on specifier instead of
the default (entry).
Unlike most analyzer configuration options, post processors act on the total result of all analyzed files
instead of on just one file/AST entry.

View file

@ -0,0 +1,32 @@
[//]: # 'AUTO INSERT HEADER PREPUBLISH'
# Dashboard
```js script
export default {
title: 'Providence/Dashboard',
};
```
An interactive overview of all aggregated [QueryResults](./QueryResult.md) can be found in the dashboard.
The dashboard is a small nodejs server (based on es-dev-server + middleware) and a frontend
application.
## Run
Start the dashboard via `yarn dashboard`; it automatically opens in the browser.
## Interface
- Select all reference projects
- Select all target projects
Press `show table` to see the result based on the updated configuration.
### Generate csv
When `get csv` is pressed, a `.csv` will be downloaded that can be loaded into Excel.
## Analyzer support
Currently, only the `match-imports` analyzer is supported; more analyzers will be added in the future.

View file

@ -0,0 +1,71 @@
[//]: # 'AUTO INSERT HEADER PREPUBLISH'
# Local configuration
```js script
export default {
title: 'Providence/LocalConfiguration',
};
```
The file `providence.conf.js` is read by the providence CLI and by the dashboard to get all
default configurations.
## Meta data
### Category info
Based on the filePath of a result, a category can be added.
For example:
```js
metaConfig: {
categoryConfig: [
{
// This is the name found in package.json
project: '@lion/root',
// These conditions will be run on every filePath
categories: {
core: p => p.startsWith('./packages/core'),
utils: p => p.startsWith('./packages/ajax') || p.startsWith('./packages/localize'),
overlays: p =>
p.startsWith('./packages/overlays') ||
p.startsWith('./packages/dialog') ||
p.startsWith('./packages/tooltip'),
...
},
},
],
},
```
> N.B. category info is regarded as subjective; therefore it's advised to keep it out of
> Analyzers (and thus out of the file-system cache). Categories can be added in real time in the dashboard.
## Project paths
### referenceCollections
A list of file system paths. They can be defined relative to the current project root (`process.cwd()`) or they can be full paths.
When a [MatchAnalyzer](./Analyzer.md) like `match-imports` or `match-subclasses` is used,
the default reference(s) can be configured here. For instance: ['/path/to/@lion/form']
An example:
```js
referenceCollections: {
// Our products
'lion-based-ui': [
'./providence-input-data/references/lion-based-ui',
'./providence-input-data/references/lion-based-ui-labs',
],
...
}
```
### searchTargetCollections
A list of file system paths. They can be defined relative to the current project root
(`process.cwd()`) or they can be full paths.
When not defined, the current project will be the search target (this is most common when
providence is used as a dev dependency).
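An example (mirroring the `providence.conf.js` that ships in this repo, so the paths are merely illustrative):
```js
searchTargetCollections: {
  exampleCollection: [
    './providence-input-data/search-targets/example-project-a',
    './providence-input-data/search-targets/example-project-b',
  ],
  // ...
}
```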

View file

@ -0,0 +1,120 @@
[//]: # 'AUTO INSERT HEADER PREPUBLISH'
# QueryResult
```js script
export default {
title: 'Providence/QueryResult',
};
```
When an Analyzer has run, it returns a QueryResult. This is a JSON object that contains all
meta info (mainly configuration parameters) and the query output.
A QueryResult always contains the analysis of one project (a target project). Optionally,
it can contain a reference project as well.
## Anatomy
A QueryResult starts with a meta section, followed by the actual results.
### Meta
The meta section lists all configuration options the analyzer was run with. Here, you see an
example of a `find-imports` QueryResult:
```js
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "find-imports",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock__1970011674",
"targetProject": {
"name": "importing-target-project",
"commitHash": "3e5014d6ecdff1fc71138cdb29aaf7bf367588f5",
"version": "0.0.2-target-mock"
},
"configuration": {
"keepInternalSources": false
}
}
},
```
### Output
The output is usually more specific to each Analyzer. What most regular Analyzers
(as opposed to MatchAnalyzers, which require a referenceProjectPath) have in common is that their
results are shown per "entry" (an entry corresponds to an AST generated by Babel, which in
turn corresponds to a file found in a target or reference project).
Below is an example of the `find-imports` QueryOutput:
```js
"queryOutput": [
{
"project": {
"name": "importing-target-project",
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"version": "0.0.2-target-mock",
"commitHash": "3e5014d6ecdff1fc71138cdb29aaf7bf367588f5"
},
"entries": [
{
"file": "./target-src/find-imports/all-notations.js",
"result": [
{
"importSpecifiers": [
"[file]"
],
"source": "imported/source",
"normalizedSource": "imported/source",
"fullSource": "imported/source"
},
{
"importSpecifiers": [
"[default]"
],
"source": "imported/source-a",
"normalizedSource": "imported/source-a",
"fullSource": "imported/source-a"
},
...
```
MatchAnalyzers usually do post processing on the entries. The output below (for the `match-imports`
Analyzer) shows an ordering by matched specifier.
```js
"queryOutput": [
{
"exportSpecifier": {
"name": "[default]",
"project": "exporting-ref-project",
"filePath": "./index.js",
"id": "[default]::./index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/root-level-imports.js",
"./target-src/match-subclasses/internalProxy.js"
]
}
]
},
...
```
Due to some legacy decisions, the QueryOutput allows for multiple target- and reference projects.
Aggregation of data now takes place in the dashboard.
QueryOutputs always contain one or a combination of two projects. This means that the
QueryOutput structure could be simplified in the future.
## Environment agnosticism
The output files stored in the file system always need to be machine independent:
this means that all machine-specific information, like a complete file path, needs to be removed from a QueryOutput (paths relative to the project root are still allowed).
In that way, the caching mechanism (based on hash comparisons) as described in [Analyzer](./Analyzer.md) is
guaranteed to work across different machines.

File diff suppressed because it is too large Load diff

View file

@ -0,0 +1,72 @@
{
"name": "providence-analytics",
"version": "0.0.0",
"description": "Providence is the 'All Seeing Eye' that measures effectivity and popularity of software. Release management will become highly efficient due to an accurate impact analysis of (breaking) changes",
"license": "MIT",
"author": "ing-bank",
"homepage": "https://github.com/ing-bank/lion/",
"repository": {
"type": "git",
"url": "https://github.com/ing-bank/lion.git",
"directory": "packages/providence-analytics"
},
"main": "./src/program/providence.js",
"bin": {
"providence": "./src/cli/index.js"
},
"files": [
"dashboard/src",
"src"
],
"scripts": {
"dashboard": "node ./dashboard/src/server.js",
"providence": "node --max-old-space-size=8192 ./src/cli/index.js",
"test:node": "mocha './test-node/program/**/*.test.js'",
"test:node:e2e": "mocha './test-node/program/**/*.e2e.js' --timeout 60000",
"test:node:watch": "yarn test:node --watch"
},
"dependencies": {
"@babel/core": "^7.10.1",
"@babel/parser": "^7.5.5",
"@babel/plugin-proposal-class-properties": "^7.8.3",
"@babel/register": "^7.5.5",
"@babel/traverse": "^7.5.5",
"@babel/types": "^7.9.0",
"@rollup/plugin-node-resolve": "^7.1.1",
"@typescript-eslint/typescript-estree": "^2.0.0",
"chalk": "^2.4.2",
"commander": "^2.20.0",
"deepmerge": "^4.0.0",
"es-dev-server": "^1.18.1",
"es-module-lexer": "^0.3.6",
"glob": "^7.1.6",
"htm": "^3.0.3",
"inquirer": "^7.0.0",
"lit-element": "^2.2.1",
"ora": "^3.4.0",
"parse5": "^5.1.1",
"read-package-tree": "5.3.1",
"semver": "^7.1.3",
"typescript": "^3.6.4"
},
"devDependencies": {
"mermaid": "^8.2.6",
"mock-fs": "^4.10.1",
"nyc": "^15.0.0",
"ssl-root-cas": "^1.3.1"
},
"keywords": [
"analysis",
"impact",
"insight",
"metrics",
"providence",
"quality",
"release management",
"semver",
"software"
],
"publishConfig": {
"access": "public"
}
}

View file

@ -0,0 +1,45 @@
// This file is read by dashboard and cli and needs to be present under process.cwd()
// It mainly serves as an example and it allows you to run the dashboard locally
// from within this repo.
const providenceConfig = {
metaConfig: {
categoryConfig: [
{
// This is the name found in package.json
project: 'lion-based-ui',
majorVersion: 1,
// These conditions will be run on every filePath
categories: {
overlays: localFilePath => {
const names = ['dialog', 'tooltip'];
const fromPackages = names.some(p => localFilePath.startsWith(`./packages/${p}`));
const fromRoot =
names.some(p => localFilePath.startsWith(`./ui-${p}`)) ||
localFilePath.startsWith('./overlays.js');
return fromPackages || fromRoot;
},
// etc...
},
},
],
},
// By predefining groups, we can do a query for programs/collections...
// Select via " providence analyze -t 'exampleCollection' "
searchTargetCollections: {
exampleCollection: [
'./providence-input-data/search-targets/example-project-a',
'./providence-input-data/search-targets/example-project-b',
],
// ...
},
referenceCollections: {
// Our products
'lion-based-ui': [
'./providence-input-data/references/lion-based-ui',
'./providence-input-data/references/lion-based-ui-labs',
],
},
};
module.exports = providenceConfig;

View file

@ -0,0 +1,17 @@
#!/usr/bin/env bash
# See https://gist.github.com/myusuf3/7f645819ded92bda6677
if [ -z "$1" ]; then
echo "Please define 'path/to/submodule'";
exit;
fi
# Remove the submodule entry from .git/config
git submodule deinit -f $1
# Remove the submodule directory from the superproject's .git/modules directory
rm -rf .git/modules/$1
# Remove the entry in .gitmodules and remove the submodule directory located at path/to/submodule
git rm -rf $1

View file

@ -0,0 +1,162 @@
/* eslint-disable no-shadow */
const pathLib = require('path');
const child_process = require('child_process'); // eslint-disable-line camelcase
const glob = require('glob');
const readPackageTree = require('../program/utils/read-package-tree-with-bower-support.js');
const { InputDataService } = require('../program/services/InputDataService.js');
const { LogService } = require('../program/services/LogService.js');
const { aForEach } = require('../program/utils/async-array-utils.js');
function csToArray(v) {
return v.split(',').map(v => v.trim());
}
function extensionsFromCs(v) {
return csToArray(v).map(v => `.${v}`);
}
function setQueryMethod(m) {
const allowedMethods = ['grep', 'ast'];
if (allowedMethods.includes(m)) {
return m;
}
// eslint-disable-next-line no-console
LogService.error(`Please provide one of the following methods: ${allowedMethods.join(', ')}`);
return undefined;
}
/**
* @returns {string[]}
*/
function pathsArrayFromCs(t) {
return t
.split(',')
.map(t => {
const isGlob = t.includes('*');
if (isGlob) {
return glob.sync(t);
}
return pathLib.resolve(process.cwd(), t.trim());
})
.flat();
}
/**
* @param {string} name collection name found in eCfg
* @param {'search-target'|'reference'} [colType='search-target'] collection type
* @param {object} eCfg external configuration. Usually providence.conf.js
* @returns {string[]}
*/
function pathsArrayFromCollectionName(name, colType = 'search-target', eCfg) {
let collection;
if (colType === 'search-target') {
collection = eCfg.searchTargetCollections;
} else if (colType === 'reference') {
collection = eCfg.referenceCollections;
}
if (collection && collection[name]) {
return pathsArrayFromCs(collection[name].join(','));
}
return undefined;
}
function spawnProcess(processArgStr, opts, { log } = {}) {
const processArgs = processArgStr.split(' ');
const proc = child_process.spawn(processArgs[0], processArgs.slice(1), opts);
let output;
proc.stdout.on('data', data => {
output += data;
if (log) {
LogService.debug(data);
}
});
return new Promise((resolve, reject) => {
proc.stderr.on('data', data => {
if (log) {
LogService.error(data);
}
reject(data.toString());
});
proc.on('close', code => {
resolve({ code, output });
});
});
}
/**
* @returns {string[]}
*/
function targetDefault() {
// eslint-disable-next-line import/no-dynamic-require, global-require
const { name } = require(`${process.cwd()}/package.json`);
if (name === 'providence') {
return InputDataService.getTargetProjectPaths();
}
return [process.cwd()];
}
/**
* @desc Returns all sub projects matching condition supplied in matchFn
* @param {string[]} rootPaths all root (search-target) project paths
* @param {function} matchFn filters out packages we're interested in
* @param {string[]} modes
*/
async function appendProjectDependencyPaths(rootPaths, matchFn, modes = ['npm', 'bower']) {
const depProjectPaths = [];
await aForEach(rootPaths, async targetPath => {
await aForEach(modes, async mode => {
await readPackageTree(
targetPath,
matchFn,
(err, tree) => {
if (err) {
throw new Error(err);
}
const paths = tree.children.map(child => child.realpath);
depProjectPaths.push(...paths);
},
mode,
);
});
});
// Write all data to {outputPath}/projectDeps.json
// const projectDeps = {};
// rootPaths.forEach(rootP => {
// depProjectPaths.filter(depP => depP.startsWith(rootP)).;
// });
return depProjectPaths.concat(rootPaths);
}
async function installDeps(searchTargetPaths) {
return aForEach(searchTargetPaths, async t => {
const spawnConfig = { cwd: t };
const extraOptions = { log: true };
LogService.info(`Installing npm dependencies for ${pathLib.basename(t)}`);
try {
await spawnProcess('npm i --no-progress', spawnConfig, extraOptions);
} catch (e) {
LogService.error(e);
}
LogService.info(`Installing bower dependencies for ${pathLib.basename(t)}`);
try {
await spawnProcess(`bower i --production --force-latest`, spawnConfig, extraOptions);
} catch (e) {
LogService.error(e);
}
});
}
module.exports = {
csToArray,
extensionsFromCs,
setQueryMethod,
pathsArrayFromCs,
targetDefault,
appendProjectDependencyPaths,
spawnProcess,
installDeps,
pathsArrayFromCollectionName,
};

View file

@ -0,0 +1,43 @@
/* eslint-disable import/no-extraneous-dependencies */
const fs = require('fs');
const pathLib = require('path');
const { performance } = require('perf_hooks');
const { providence } = require('../program/providence.js');
const { QueryService } = require('../program/services/QueryService.js');
const { LogService } = require('../program/services/LogService.js');
async function launchProvidenceWithExtendDocs(referencePaths, prefixObj, outputFolder) {
const t0 = performance.now();
const results = await providence(
QueryService.getQueryConfigFromAnalyzer('match-paths', { prefix: prefixObj }),
{
gatherFilesConfig: {
extensions: ['.js', '.html'],
excludeFolders: ['coverage', 'test'],
},
queryMethod: 'ast',
report: false,
targetProjectPaths: [pathLib.resolve(process.cwd())],
referenceProjectPaths: referencePaths,
},
);
const outputFilePath = pathLib.join(outputFolder, 'providence-extend-docs-data.json');
const queryOutputs = results.map(result => result.queryOutput).flat();
if (fs.existsSync(outputFilePath)) {
fs.unlinkSync(outputFilePath);
}
fs.writeFile(outputFilePath, JSON.stringify(queryOutputs), err => {
if (err) {
throw err;
}
});
const t1 = performance.now();
LogService.info(`"extend-docs" completed in ${Math.round((t1 - t0) / 1000)} seconds`);
}
module.exports = {
launchProvidenceWithExtendDocs,
};

View file

@ -0,0 +1,273 @@
#!/usr/bin/env node
// @ts-ignore-next-line
require('../program/types/index.js');
const child_process = require('child_process'); // eslint-disable-line camelcase
const pathLib = require('path');
const commander = require('commander');
const { providence } = require('../program/providence.js');
const { LogService } = require('../program/services/LogService.js');
const { QueryService } = require('../program/services/QueryService.js');
const { InputDataService } = require('../program/services/InputDataService.js');
const { promptAnalyzerMenu, promptAnalyzerConfigMenu } = require('./prompt-analyzer-menu.js');
const {
extensionsFromCs,
setQueryMethod,
targetDefault,
appendProjectDependencyPaths,
installDeps,
pathsArrayFromCollectionName,
pathsArrayFromCs,
} = require('./cli-helpers.js');
const { launchProvidenceWithExtendDocs } = require('./generate-extend-docs-data.js');
// @ts-ignore-next-line
const { version } = require('../../package.json');
/** @type {'analyzer'|'queryString'} */
let searchMode;
/** @type {object} */
let analyzerOptions;
/** @type {object} */
let featureOptions;
/** @type {object} */
let regexSearchOptions;
const externalConfig = InputDataService.getExternalConfig();
// eslint-disable-next-line no-shadow
async function getQueryInputData(searchMode, regexSearchOptions, featureOptions, analyzerOptions) {
let queryConfig = null;
let queryMethod = null;
if (searchMode === 'search-query') {
queryConfig = QueryService.getQueryConfigFromRegexSearchString(regexSearchOptions.regexString);
queryMethod = 'grep';
} else if (searchMode === 'feature-query') {
queryConfig = QueryService.getQueryConfigFromFeatureString(featureOptions.queryString);
queryMethod = 'grep';
} else if (searchMode === 'analyzer-query') {
let { name, config } = analyzerOptions;
if (!name) {
const answers = await promptAnalyzerMenu();
name = answers.analyzerName;
}
if (!config) {
const answers = await promptAnalyzerConfigMenu(name, analyzerOptions.promptOptionalConfig);
config = answers.analyzerConfig;
}
// Will get metaConfig from ./providence.conf.js
const metaConfig = externalConfig ? externalConfig.metaConfig : {};
config = { ...config, metaConfig };
queryConfig = QueryService.getQueryConfigFromAnalyzer(name, config);
queryMethod = 'ast';
} else {
LogService.error('Please define a feature, analyzer or search');
process.exit(1);
}
return { queryConfig, queryMethod };
}
async function launchProvidence() {
const { queryConfig, queryMethod } = await getQueryInputData(
searchMode,
regexSearchOptions,
featureOptions,
analyzerOptions,
);
const searchTargetPaths = commander.searchTargetCollection || commander.searchTargetPaths;
let referencePaths;
if (queryConfig.analyzer.requiresReference) {
referencePaths = commander.referenceCollection || commander.referencePaths;
}
// const extendedSearchTargets = searchTargetPaths;
const extendedSearchTargets = await appendProjectDependencyPaths(searchTargetPaths);
// TODO: filter out:
// - dependencies listed in reference (?) Or at least, inside match-imports, make sure that
// we do not test against ourselves...
// -
providence(queryConfig, {
gatherFilesConfig: {
extensions: commander.extensions,
...(commander.filteredTarget ? { excludeFolders: commander.filteredTarget } : {}),
includePaths: commander.whitelist,
},
gatherFilesConfigReference: {
extensions: commander.extensions,
...(commander.filteredTarget ? { excludeFolders: commander.filteredTarget } : {}),
includePaths: commander.whitelistReference,
},
debugEnabled: commander.debug,
queryMethod,
targetProjectPaths: extendedSearchTargets,
referenceProjectPaths: referencePaths,
targetProjectRootPaths: searchTargetPaths,
writeLogFile: commander.writeLogFile,
});
}
async function manageSearchTargets(options) {
const basePath = pathLib.join(__dirname, '../..');
if (options.update) {
LogService.info('git submodule update --init --recursive');
const updateResult = child_process.execSync('git submodule update --init --recursive', {
cwd: basePath,
});
LogService.info(String(updateResult));
}
if (options.deps) {
await installDeps(commander.searchTargetPaths);
}
if (options.createVersionHistory) {
await installDeps(commander.searchTargetPaths);
}
}
commander
.version(version, '-v, --version')
.option('-e, --extensions [extensions]', 'extensions like ".js, .html"', extensionsFromCs, [
'.js',
'.html',
])
.option('-D, --debug', 'shows extensive logging')
.option(
'-t, --search-target-paths [targets]',
`path(s) to project(s) on which analysis/querying should take place. Requires
a list of comma separated values relative to project root`,
pathsArrayFromCs,
targetDefault(),
)
.option(
'-r, --reference-paths [references]',
`path(s) to project(s) which serve as a reference (applicable for certain analyzers like
'match-imports'). Requires a list of comma separated values relative to
project root (like 'node_modules/lion-based-ui, node_modules/lion-based-ui-labs').`,
pathsArrayFromCs,
InputDataService.referenceProjectPaths,
)
.option(
'-w, --whitelist [whitelist]',
`whitelisted paths, like './src, ./packages/*'`,
pathsArrayFromCs,
)
.option(
'--whitelist-reference [whitelist-reference]',
`whitelisted paths for reference, like './src, ./packages/*'`,
pathsArrayFromCs,
)
.option(
'--search-target-collection [collection-name]',
`path(s) to project(s) on which analysis/querying should take place. Should be a collection
defined in providence.conf.js as paths relative to project root.`,
v => pathsArrayFromCollectionName(v, 'search-target', externalConfig),
)
.option(
'--reference-collection [collection-name]',
`path(s) to project(s) which serve as a reference (applicable for certain analyzers like
'match-imports'). Should be a collection defined in providence.conf.js as paths relative to project root.`,
v => pathsArrayFromCollectionName(v, 'reference', externalConfig),
)
.option('--write-log-file', `Writes all logs to 'providence.log' file`);
commander
.command('search <regex>')
.alias('s')
.description('performs a regex search with a string like "my-.*-comp"')
.action((regexString, options) => {
searchMode = 'search-query';
regexSearchOptions = options;
regexSearchOptions.regexString = regexString;
launchProvidence();
});
commander
.command('feature <query-string>')
.alias('f')
.description('query like "tg-icon[size=xs]"')
.option('-m, --method [method]', 'query method: "grep" or "ast"', setQueryMethod, 'grep')
.action((queryString, options) => {
searchMode = 'feature-query';
featureOptions = options;
featureOptions.queryString = queryString;
launchProvidence();
});
commander
.command('analyze [analyzer-name]')
.alias('a')
.description(
`predefined "query" for ast analysis. Can be a script found in program/analyzers,
like "find-imports"`,
)
.option(
'-o, --prompt-optional-config',
`by default, only required configuration options are
asked for. When this flag is provided, optional configuration options are shown as well`,
)
.option('-c, --config [config]', 'configuration object for analyzer', c => JSON.parse(c))
.action((analyzerName, options) => {
searchMode = 'analyzer-query';
analyzerOptions = options;
analyzerOptions.name = analyzerName;
launchProvidence();
});
commander
.command('extend-docs')
.alias('e')
.description(
`Generates data for the "babel-extend-docs" plugin. The data are generated by the "match-paths"
analyzer, which automatically resolves import paths from reference projects
(say [@lion/input, @lion/textarea, ...etc]) to a target project (say "wolf-ui").`,
)
.option(
'--prefix-from [prefix-from]',
`Prefix for components of reference layer. By default "lion"`,
a => a,
'lion',
)
.option(
'--prefix-to [prefix-to]',
`Prefix for components of the target layer. For instance "wolf"`,
)
.option(
'--output-folder [output-folder]',
`The output folder in which the result file "providence-extend-docs-data.json" will be written`,
p => pathLib.resolve(process.cwd(), p.trim()),
process.cwd(),
)
.action(options => {
if (!options.prefixTo) {
LogService.error(`Please provide a "prefix to" like '--prefix-to "myprefix"'`);
process.exit(1);
}
if (!commander.referencePaths) {
LogService.error(`Please provide reference paths like '-r "node_modules/@lion/*"'`);
process.exit(1);
}
const prefixCfg = { from: options.prefixFrom, to: options.prefixTo };
launchProvidenceWithExtendDocs(commander.referencePaths, prefixCfg, options.outputFolder);
});
commander
.command('manage-projects')
.description(
`Before running a query, be sure to have search-targets up to date (think of
npm/bower dependencies, latest version etc.)`,
)
.option('-u, --update', 'gets latest of all search-targets and references')
.option('-d, --deps', 'installs npm/bower dependencies of search-targets')
.option('-h, --create-version-history', 'creates a version history of all search-targets')
.action(options => {
manageSearchTargets(options);
});
commander.parse(process.argv);
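// Illustrative invocations of the commands registered above (projects and paths are
// hypothetical; run `providence --help` for the authoritative list of options):
//
//   providence search "my-.*-comp" -t ./packages/app
//   providence feature "tg-icon[size=xs]" -m grep
//   providence analyze match-imports -r "node_modules/@lion/*"
//   providence extend-docs --prefix-from lion --prefix-to wolf -r "node_modules/@lion/*"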

View file

@@ -0,0 +1,133 @@
const fs = require('fs');
const pathLib = require('path');
const inquirer = require('inquirer');
const { default: traverse } = require('@babel/traverse');
const { InputDataService } = require('../program/services/InputDataService.js');
const { AstService } = require('../program/services/AstService.js');
const { LogService } = require('../program/services/LogService.js');
const JsdocCommentParser = require('../program/utils/jsdoc-comment-parser.js');
/**
* @desc extracts name, defaultValue, optional, type, desc from JsdocCommentParser.parse method
* result
* @param {array} jsdoc
* @returns {object}
*/
function getPropsFromParsedJsDoc(jsdoc) {
const jsdocProps = jsdoc.filter(p => p.tagName === '@property');
const options = jsdocProps.map(({ tagValue }) => {
// eslint-disable-next-line no-unused-vars
const [_, type, nameOptionalDefault, desc] = tagValue.match(/\{(.*)\}\s*([^\s]*)\s*(.*)/);
let nameDefault = nameOptionalDefault;
let optional = false;
if (nameOptionalDefault.startsWith('[') && nameOptionalDefault.endsWith(']')) {
optional = true;
nameDefault = nameOptionalDefault.slice(1).slice(0, -1);
}
const [name, defaultValue] = nameDefault.split('=');
return { name, defaultValue, optional, type, desc };
});
return options;
}
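// Illustrative sketch of the mapping performed above (the tag value is hypothetical):
//   { tagName: '@property', tagValue: '{boolean} [keepInternalSources=false] keeps relative paths' }
// becomes
//   { name: 'keepInternalSources', defaultValue: 'false', optional: true, type: 'boolean', desc: 'keeps relative paths' }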
function getAnalyzerOptions(file) {
const code = fs.readFileSync(file, 'utf8');
const ast = AstService.getAst(code, 'babel', { filePath: file });
let commentNode;
traverse(ast, {
// eslint-disable-next-line no-shadow
VariableDeclaration(path) {
if (!path.node.leadingComments) {
return;
}
const decls = path.node.declarations || [];
decls.forEach(decl => {
if (decl && decl.id && decl.id.name === 'cfg') {
[commentNode] = path.node.leadingComments;
}
});
},
});
if (commentNode) {
const jsdoc = JsdocCommentParser.parse(commentNode);
return getPropsFromParsedJsDoc(jsdoc);
}
return undefined;
}
function gatherAnalyzers(dir, getConfigOptions) {
return InputDataService.gatherFilesFromDir(dir, { depth: 0 }).map(file => {
const analyzerObj = { file, name: pathLib.basename(file, '.js') };
if (getConfigOptions) {
analyzerObj.options = getAnalyzerOptions(file);
}
return analyzerObj;
});
}
async function promptAnalyzerConfigMenu(
analyzerName,
promptOptionalConfig,
dir = pathLib.resolve(__dirname, '../program/analyzers'),
) {
const menuOptions = gatherAnalyzers(dir, true);
const analyzer = menuOptions.find(o => o.name === analyzerName);
if (!analyzer) {
LogService.error(`[promptAnalyzerConfigMenu] analyzer "${analyzerName}" not found.`);
process.exit(1);
}
let configAnswers;
if (analyzer.options) {
configAnswers = await inquirer.prompt(
analyzer.options
.filter(a => promptOptionalConfig || !a.optional)
.map(a => ({
name: a.name,
message: a.desc,
...(a.defaultValue ? { default: a.defaultValue } : {}),
})),
);
Object.entries(configAnswers).forEach(([key, value]) => {
const { type } = analyzer.options.find(o => o.name === key);
if (type.toLowerCase() === 'boolean') {
configAnswers[key] = value === 'false' ? false : Boolean(value);
} else if (type.toLowerCase() === 'number') {
configAnswers[key] = Number(value);
} else if (type.toLowerCase() !== 'string') {
if (value) {
configAnswers[key] = JSON.parse(value);
} else {
// Make sure to not override predefined values with undefined ones
delete configAnswers[key];
}
}
});
}
return {
analyzerConfig: configAnswers,
};
}
async function promptAnalyzerMenu(dir = pathLib.resolve(__dirname, '../program/analyzers')) {
const menuOptions = gatherAnalyzers(dir);
const answers = await inquirer.prompt([
{
type: 'list',
name: 'analyzerName',
message: 'Which analyzer do you want to run?',
choices: menuOptions.map(o => o.name),
},
]);
return {
analyzerName: answers.analyzerName,
};
}
module.exports = {
promptAnalyzerMenu,
promptAnalyzerConfigMenu,
};
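// Hedged usage sketch (not executed here): a caller could chain the two prompts above,
// for instance:
//   const { analyzerName } = await promptAnalyzerMenu();
//   const { analyzerConfig } = await promptAnalyzerConfigMenu(analyzerName, true);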

View file

@@ -0,0 +1,250 @@
/* eslint-disable no-shadow, no-param-reassign */
const pathLib = require('path');
const t = require('@babel/types');
const { default: traverse } = require('@babel/traverse');
const { Analyzer } = require('./helpers/Analyzer.js');
const { trackDownIdentifierFromScope } = require('./helpers/track-down-identifier.js');
const { aForEach } = require('../utils/async-array-utils.js');
/** @typedef {import('./types').FindClassesAnalyzerOutput} FindClassesAnalyzerOutput */
/** @typedef {import('./types').FindClassesAnalyzerOutputEntry} FindClassesAnalyzerOutputEntry */
/** @typedef {import('./types').FindClassesConfig} FindClassesConfig */
/**
* @desc Finds all class members (properties and methods) per AST entry
* @param {BabelAst} ast
* @param {string} fullCurrentFilePath the file being currently processed
* @param {string} projectPath the root path of the project under analysis
*/
async function findMembersPerAstEntry(ast, fullCurrentFilePath, projectPath) {
// The transformed entry
const classesFound = [];
/**
* @desc Detects private/publicness based on underscores. Checks '$' as well
* @returns {'public'|'protected'|'private'}
*/
function computeAccessType(name) {
if (name.startsWith('_') || name.startsWith('$')) {
// (at least) 2 prefixes
if (name.startsWith('__') || name.startsWith('$$')) {
return 'private';
}
return 'protected';
}
return 'public';
}
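// Examples of the convention applied above: 'render' -> 'public',
// '_onChange' or '$attr' -> 'protected', '__privateHelper' or '$$slot' -> 'private'.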
function isStaticProperties({ node }) {
return node.static && node.kind === 'get' && node.key.name === 'properties';
}
// function isBlacklisted({ node }) {
// // Handle static getters
// const sgBlacklistPlatform = ['attributes'];
// const sgBlacklistLitEl = ['properties', 'styles'];
// const sgBlacklistLion = ['localizeNamespaces'];
// const sgBlacklist = [...sgBlacklistPlatform, ...sgBlacklistLitEl, ...sgBlacklistLion];
// if (node.kind === 'get' && node.static && sgBlacklist.includes(node.key.name)) {
// return true;
// }
// // Handle getters
// const gBlacklistLitEl = ['updateComplete'];
// const gBlacklistLion = ['slots'];
// const gBlacklist = [...gBlacklistLion, ...gBlacklistLitEl];
// if (node.kind === 'get' && !node.static && gBlacklist.includes(node.key.name)) {
// return true;
// }
// // Handle methods
// const mBlacklistPlatform = ['constructor', 'connectedCallback', 'disconnectedCallback'];
// const mBlacklistLitEl = [
// '_requestUpdate',
// 'createRenderRoot',
// 'render',
// 'updated',
// 'firstUpdated',
// 'update',
// 'shouldUpdate',
// ];
// const mBlacklistLion = ['onLocaleUpdated'];
// const mBlacklist = [...mBlacklistPlatform, ...mBlacklistLitEl, ...mBlacklistLion];
// if (!node.static && mBlacklist.includes(node.key.name)) {
// return true;
// }
// return false;
// }
async function traverseClass(path, { isMixin } = {}) {
const classRes = {};
classRes.name = path.node.id && path.node.id.name;
classRes.isMixin = Boolean(isMixin);
if (path.node.superClass) {
const superClasses = [];
// Add all Identifier names
let parent = path.node.superClass;
while (parent.type === 'CallExpression') {
superClasses.push({ name: parent.callee.name, isMixin: true });
// As long as we are a CallExpression, we will have a parent
[parent] = parent.arguments;
}
// At the end of the chain, we find type === Identifier
superClasses.push({ name: parent.name, isMixin: false });
// For all found superclasses, track down their root location.
// This will either result in a local, relative path in the project,
// or an external path like '@lion/overlays'. In the latter case,
// tracking down will halt and should be done when there is access to
// the external repo... (similar to how 'match-imports' analyzer works)
await aForEach(superClasses, async classObj => {
// Finds the file that holds the declaration of the import
classObj.rootFile = await trackDownIdentifierFromScope(
path,
classObj.name,
fullCurrentFilePath,
projectPath,
);
});
classRes.superClasses = superClasses;
}
classRes.members = {};
classRes.members.props = []; // meta: private, public, getter/setter, (found in static get properties)
classRes.members.methods = []; // meta: private, public, getter/setter
path.traverse({
ClassMethod(path) {
// if (isBlacklisted(path)) {
// return;
// }
if (isStaticProperties(path)) {
let hasFoundTopLvlObjExpr = false;
path.traverse({
ObjectExpression(path) {
if (hasFoundTopLvlObjExpr) return;
hasFoundTopLvlObjExpr = true;
path.node.properties.forEach(objectProperty => {
if (!t.isProperty(objectProperty)) {
// we can also have a SpreadElement
return;
}
const propRes = {};
const { name } = objectProperty.key;
propRes.name = name;
propRes.accessType = computeAccessType(name);
propRes.kind = [...(propRes.kind || []), objectProperty.kind];
classRes.members.props.push(propRes);
});
},
});
return;
}
const methodRes = {};
const { name } = path.node.key;
methodRes.name = name;
methodRes.accessType = computeAccessType(name);
if (path.node.kind === 'set' || path.node.kind === 'get') {
if (path.node.static) {
methodRes.static = true;
}
methodRes.kind = [...(methodRes.kind || []), path.node.kind];
// Merge getter/setters into one
const found = classRes.members.props.find(p => p.name === name);
if (found) {
found.kind = [...(found.kind || []), path.node.kind];
} else {
classRes.members.props.push(methodRes);
}
} else {
classRes.members.methods.push(methodRes);
}
},
});
classesFound.push(classRes);
}
const classesToTraverse = [];
traverse(ast, {
ClassDeclaration(path) {
classesToTraverse.push({ path, isMixin: false });
},
ClassExpression(path) {
classesToTraverse.push({ path, isMixin: true });
},
});
await aForEach(classesToTraverse, async klass => {
await traverseClass(klass.path, { isMixin: klass.isMixin });
});
return classesFound;
}
// // TODO: split up and make configurable
// function _flattenedFormsPostProcessor(queryOutput) {
// // Temp: post process, so that we, per category, per file, get all public props
// queryOutput[0].entries = queryOutput[0].entries
// .filter(entry => {
// // contains only forms (and thus is not a test or demo)
// return entry.meta.categories.includes('forms') && entry.meta.categories.length === 1;
// })
// .map(entry => {
// const newResult = entry.result.map(({ name, props, methods }) => {
// return {
// name,
// props: props.filter(p => p.meta.accessType === 'public').map(p => p.name),
// methods: methods.filter(m => m.meta.accessType === 'public').map(m => m.name),
// };
// });
// return { file: entry.file, result: newResult };
// });
// }
class FindClassesAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'find-classes';
}
/**
* @desc Finds all members (properties, incl. getters/setters, and methods) of a class and
* makes a distinction between private, protected and public members
* @param {FindClassesConfig} customConfig
*/
async execute(customConfig = {}) {
/** @type {FindClassesConfig} */
const cfg = {
gatherFilesConfig: {},
targetProjectPath: null,
metaConfig: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
/** @type {FindClassesAnalyzerOutput} */
const queryOutput = await this._traverse(async (ast, { relativePath }) => {
const projectPath = cfg.targetProjectPath;
const fullPath = pathLib.resolve(projectPath, relativePath);
const transformedEntry = await findMembersPerAstEntry(ast, fullPath, projectPath);
return { result: transformedEntry };
});
// _flattenedFormsPostProcessor();
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = FindClassesAnalyzer;
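// Hedged sketch of one class entry produced by findMembersPerAstEntry for a hypothetical
// `class WolfInput extends LionInput {}` with a modelValue getter/setter pair and a
// protected _onChange() method:
//   {
//     name: 'WolfInput',
//     isMixin: false,
//     superClasses: [
//       { name: 'LionInput', isMixin: false, rootFile: { file: '@lion/input', specifier: 'LionInput' } },
//     ],
//     members: {
//       props: [{ name: 'modelValue', accessType: 'public', kind: ['get', 'set'] }],
//       methods: [{ name: '_onChange', accessType: 'protected' }],
//     },
//   }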

View file

@@ -0,0 +1,129 @@
const pathLib = require('path');
const t = require('@babel/types');
const { default: traverse } = require('@babel/traverse');
const { Analyzer } = require('./helpers/Analyzer.js');
const { trackDownIdentifierFromScope } = require('./helpers/track-down-identifier.js');
const { aForEach } = require('../utils/async-array-utils.js');
function cleanup(transformedEntry) {
transformedEntry.forEach(definitionObj => {
if (definitionObj.__tmp) {
// eslint-disable-next-line no-param-reassign
delete definitionObj.__tmp;
}
});
return transformedEntry;
}
async function trackdownRoot(transformedEntry, relativePath, projectPath) {
const fullCurrentFilePath = pathLib.resolve(projectPath, relativePath);
await aForEach(transformedEntry, async definitionObj => {
const rootFile = await trackDownIdentifierFromScope(
definitionObj.__tmp.path,
definitionObj.constructorIdentifier,
fullCurrentFilePath,
projectPath,
);
// eslint-disable-next-line no-param-reassign
definitionObj.rootFile = rootFile;
});
return transformedEntry;
}
/**
* @desc Finds customElements.define calls and extracts their tag names and constructor identifiers
* @param {BabelAst} ast
*/
function findCustomElementsPerAstEntry(ast) {
const definitions = [];
traverse(ast, {
CallExpression(path) {
let found = false;
// Doing it like this we detect 'customElements.define()',
// but also 'window.customElements.define()'
path.traverse({
MemberExpression(memberPath) {
if (memberPath.parentPath !== path) {
return;
}
const { node } = memberPath;
if (node.object.name === 'customElements' && node.property.name === 'define') {
found = true;
}
if (
node.object.object &&
node.object.object.name === 'window' &&
node.object.property.name === 'customElements' &&
node.property.name === 'define'
) {
found = true;
}
},
});
if (found) {
let tagName;
let constructorIdentifier;
if (t.isLiteral(path.node.arguments[0])) {
tagName = path.node.arguments[0].value;
} else {
// No Literal found. For now, we only mark them as '[variable]'
tagName = '[variable]';
}
if (path.node.arguments[1].type === 'Identifier') {
constructorIdentifier = path.node.arguments[1].name;
} else {
// We assume customElements.define('my-el', class extends HTMLElement {...})
constructorIdentifier = '[inline]';
}
definitions.push({ tagName, constructorIdentifier, __tmp: { path } });
}
},
});
return definitions;
}
class FindCustomelementsAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'find-customelements';
}
/**
* @desc Finds custom element definitions (tag name + constructor) in a target project
* @param {FindCustomelementsConfig} customConfig
*/
async execute(customConfig = {}) {
const cfg = {
targetProjectPath: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const projectPath = cfg.targetProjectPath;
const queryOutput = await this._traverse(async (ast, { relativePath }) => {
let transformedEntry = findCustomElementsPerAstEntry(ast);
transformedEntry = await trackdownRoot(transformedEntry, relativePath, projectPath);
transformedEntry = cleanup(transformedEntry);
return { result: transformedEntry };
});
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = FindCustomelementsAnalyzer;
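// Hedged sketch of one definition entry for a hypothetical
// `customElements.define('wolf-input', WolfInput)` call:
//   { tagName: 'wolf-input', constructorIdentifier: 'WolfInput',
//     rootFile: { file: './src/WolfInput.js', specifier: 'WolfInput' } }
// A non-literal tag name is stored as '[variable]' and an inline class as '[inline]'.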

View file

@@ -0,0 +1,224 @@
/* eslint-disable no-shadow, no-param-reassign */
const pathLib = require('path');
const { default: traverse } = require('@babel/traverse');
const { Analyzer } = require('./helpers/Analyzer.js');
const { trackDownIdentifier } = require('./helpers/track-down-identifier.js');
const { normalizeSourcePaths } = require('./helpers/normalize-source-paths.js');
const { aForEach } = require('../utils/async-array-utils.js');
/** @typedef {import('./helpers/track-down-identifier.js').RootFile} RootFile */
/**
* @typedef {object} RootFileMapEntry
* @property {string} currentFileSpecifier this is the local name in the file we track from
* @property {RootFile} rootFile contains file(filePath) and specifier
*/
/**
* @typedef {RootFileMapEntry[]} RootFileMap
*/
async function trackdownRoot(transformedEntry, relativePath, projectPath) {
const fullCurrentFilePath = pathLib.resolve(projectPath, relativePath);
await aForEach(transformedEntry, async specObj => {
/** @type {RootFileMap} */
const rootFileMap = [];
if (specObj.exportSpecifiers[0] === '[file]') {
rootFileMap.push(undefined);
} else {
/**
* './src/origin.js': `export class MyComp {}`
* './index.js': `export { MyComp as RenamedMyComp } from './src/origin'`
*
* Goes from specifier like 'RenamedMyComp' to object for rootFileMap like:
* {
* currentFileSpecifier: 'RenamedMyComp',
* rootFile: {
* file: './src/origin.js',
*     specifier: 'MyComp',
* }
* }
*/
await aForEach(specObj.exportSpecifiers, async (/** @type {string} */ specifier) => {
let rootFile;
let localMapMatch;
if (specObj.localMap) {
localMapMatch = specObj.localMap.find(m => m.exported === specifier);
}
// TODO: find out if possible to use trackDownIdentifierFromScope
if (specObj.source) {
// TODO: see if still needed: && (localMapMatch || specifier === '[default]')
const importedIdentifier = (localMapMatch && localMapMatch.local) || specifier;
rootFile = await trackDownIdentifier(
specObj.source,
importedIdentifier,
fullCurrentFilePath,
projectPath,
);
/** @type {RootFileMapEntry} */
const entry = {
currentFileSpecifier: specifier,
rootFile,
};
rootFileMap.push(entry);
} else {
/** @type {RootFileMapEntry} */
const entry = {
currentFileSpecifier: specifier,
rootFile: { file: '[current]', specifier },
};
rootFileMap.push(entry);
}
});
}
specObj.rootFileMap = rootFileMap;
});
return transformedEntry;
}
function cleanup(transformedEntry) {
transformedEntry.forEach(specObj => {
if (specObj.__tmp) {
delete specObj.__tmp;
}
});
return transformedEntry;
}
/**
* @returns {string[]}
*/
function getExportSpecifiers(node) {
// handles declarations like [export const g = 4] or [export class Foo {}]
if (node.declaration) {
if (node.declaration.declarations) {
return [node.declaration.declarations[0].id.name];
}
if (node.declaration.id) {
return [node.declaration.id.name];
}
}
// handles (re)named specifiers [export { x (as y)} from 'y'];
return node.specifiers.map(s => {
let specifier;
if (s.exported) {
// { x as y }
specifier = s.exported.name;
} else {
// { x }
specifier = s.local.name;
}
return specifier;
});
}
/**
* @returns {object[]}
*/
function getLocalNameSpecifiers(node) {
return node.specifiers
.map(s => {
if (s.exported && s.local && s.exported.name !== s.local.name) {
return {
local: s.local.name,
exported: s.exported.name,
};
}
return undefined;
})
.filter(s => s);
}
/**
* @desc Finds import specifiers and sources for a given ast result
* @param {BabelAst} ast
* @param {boolean} searchForFileImports
*/
function findExportsPerAstEntry(ast, searchForFileImports) {
// Visit AST...
const transformedEntry = [];
// Unfortunately, we cannot have async functions in babel traverse.
// Therefore, we store a temp reference to path that we use later for
// async post processing (tracking down original export Identifier)
traverse(ast, {
ExportNamedDeclaration(path) {
const exportSpecifiers = getExportSpecifiers(path.node);
const localMap = getLocalNameSpecifiers(path.node);
const source = path.node.source && path.node.source.value;
transformedEntry.push({ exportSpecifiers, localMap, source, __tmp: { path } });
},
ExportDefaultDeclaration(path) {
const exportSpecifiers = ['[default]'];
const source = path.node.declaration.name;
transformedEntry.push({ exportSpecifiers, source, __tmp: { path } });
},
});
if (searchForFileImports) {
// Always add an entry for just the file 'relativePath'
// (since this also can be imported directly from a search target project)
transformedEntry.push({
exportSpecifiers: ['[file]'],
// source: relativePath,
});
}
return transformedEntry;
}
class FindExportsAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'find-exports';
}
/**
* @desc Finds export specifiers and sources
* @param {FindExportsConfig} customConfig
*/
async execute(customConfig = {}) {
/**
* @typedef FindExportsConfig
* @property {boolean} [onlyInternalSources=false]
* @property {{ [category]: (filePath) => boolean }} [customConfig.categories] object with
* categories as keys and (not necessarily mutually exclusive) functions that define a category
* @property {boolean} searchForFileImports Instead of only focusing on specifiers like
* [import {specifier} 'lion-based-ui/foo.js'], also list [import 'lion-based-ui/foo.js'] as a result
*/
const cfg = {
targetProjectPath: null,
metaConfig: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const projectPath = cfg.targetProjectPath;
const queryOutput = await this._traverse(async (ast, { relativePath }) => {
let transformedEntry = findExportsPerAstEntry(ast, cfg, relativePath, projectPath);
transformedEntry = await normalizeSourcePaths(transformedEntry, relativePath, projectPath);
transformedEntry = await trackdownRoot(transformedEntry, relativePath, projectPath);
transformedEntry = cleanup(transformedEntry);
return { result: transformedEntry };
});
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = FindExportsAnalyzer;
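// Hedged sketch of one queryOutput entry, following the re-export example documented
// in trackdownRoot above (paths are hypothetical):
//   {
//     file: './index.js',
//     result: [{
//       exportSpecifiers: ['RenamedMyComp'],
//       localMap: [{ local: 'MyComp', exported: 'RenamedMyComp' }],
//       source: './src/origin',
//       normalizedSource: './src/origin.js',
//       rootFileMap: [{
//         currentFileSpecifier: 'RenamedMyComp',
//         rootFile: { file: './src/origin.js', specifier: 'MyComp' },
//       }],
//     }],
//   }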

View file

@@ -0,0 +1,158 @@
/* eslint-disable no-shadow, no-param-reassign */
const { default: traverse } = require('@babel/traverse');
const { isRelativeSourcePath } = require('../utils/relative-source-path.js');
const { normalizeSourcePaths } = require('./helpers/normalize-source-paths.js');
const { Analyzer } = require('./helpers/Analyzer.js');
/**
* Options that allow filtering on a per-file basis.
* We can also filter on the total result.
*/
const /** @type {AnalyzerOptions} */ options = {
/**
* @desc Only leaves entries with external sources:
* - keeps: '@open-wc/testing'
* - drops: '../testing'
* @param {FindImportsAnalysisResult} result
*/
onlyExternalSources(result) {
return result.filter(entry => !isRelativeSourcePath(entry.source));
},
};
function getImportOrReexportsSpecifiers(node) {
return node.specifiers.map(s => {
if (s.type === 'ImportDefaultSpecifier' || s.type === 'ExportDefaultSpecifier') {
return '[default]';
}
if (s.type === 'ImportNamespaceSpecifier' || s.type === 'ExportNamespaceSpecifier') {
return '[*]';
}
if ((s.imported && s.type === 'ImportNamespaceSpecifier') || s.type === 'ImportSpecifier') {
return s.imported.name;
}
if (s.exported && s.type === 'ExportNamespaceSpecifier') {
return s.exported.name;
}
return s.local.name;
});
}
/**
* @desc Finds import specifiers and sources
* @param {BabelAst} ast
* @param {string} context.relativePath the file being currently processed
*/
function findImportsPerAstEntry(ast) {
// Visit AST...
const transformedEntry = [];
traverse(ast, {
ImportDeclaration(path) {
const importSpecifiers = getImportOrReexportsSpecifiers(path.node);
if (importSpecifiers.length === 0) {
importSpecifiers.push('[file]'); // apparently, there was just a file import
}
const source = path.node.source.value;
transformedEntry.push({ importSpecifiers, source });
},
// Dynamic imports
CallExpression(path) {
if (path.node.callee && path.node.callee.type === 'Import') {
// TODO: check for specifiers caught via obj destructuring?
// TODO: also check for ['file']
const importSpecifiers = ['[default]'];
let source = path.node.arguments[0].value;
if (!source) {
// TODO: with advanced retrieval, we could possibly get the value
source = '[variable]';
}
transformedEntry.push({ importSpecifiers, source });
}
},
ExportNamedDeclaration(path) {
if (!path.node.source) {
return; // we are dealing with a regular export, not a reexport
}
const importSpecifiers = getImportOrReexportsSpecifiers(path.node);
const source = path.node.source.value;
transformedEntry.push({ importSpecifiers, source });
},
// ExportAllDeclaration(path) {
// if (!path.node.source) {
// return; // we are dealing with a regular export, not a reexport
// }
// const importSpecifiers = ['[*]'];
// const source = path.node.source.value;
// transformedEntry.push({ importSpecifiers, source });
// },
});
return transformedEntry;
}
class FindImportsAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'find-imports';
}
/**
* @desc Finds import specifiers and sources
* @param {FindImportsConfig} customConfig
*/
async execute(customConfig = {}) {
/**
* @typedef FindImportsConfig
* @property {boolean} [keepInternalSources=false] by default, relative paths like '../x.js' are
* filtered out. This option keeps them.
* (Source paths are normalized: 'external-dep/file' will be resolved to 'external-dep/file.js'
* and both will be stored as the latter.)
*/
const cfg = {
targetProjectPath: null,
// post process file
keepInternalSources: false,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const queryOutput = await this._traverse(async (ast, { relativePath }) => {
let transformedEntry = findImportsPerAstEntry(ast);
// Post processing based on configuration...
transformedEntry = await normalizeSourcePaths(
transformedEntry,
relativePath,
cfg.targetProjectPath,
);
if (!cfg.keepInternalSources) {
transformedEntry = options.onlyExternalSources(transformedEntry);
}
return { result: transformedEntry };
});
// if (cfg.sortBySpecifier) {
// queryOutput = sortBySpecifier.execute(queryOutput, {
// ...cfg,
// specifiersKey: 'importSpecifiers',
// });
// }
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = FindImportsAnalyzer;
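// Hedged sketch of one queryOutput entry with the default config (relative sources are
// filtered out by onlyExternalSources; file and package names are hypothetical):
//   {
//     file: './src/app.js',
//     result: [
//       { importSpecifiers: ['LionInput'], source: '@lion/input', normalizedSource: '@lion/input' },
//       { importSpecifiers: ['[default]'], source: '@lion/icon/icon.js', normalizedSource: '@lion/icon/icon.js' },
//     ],
//   }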

View file

@@ -0,0 +1,257 @@
/* eslint-disable no-param-reassign */
const fs = require('fs');
const semver = require('semver');
const pathLib = require('path');
const { LogService } = require('../../services/LogService.js');
const { QueryService } = require('../../services/QueryService.js');
const { ReportService } = require('../../services/ReportService.js');
const { InputDataService } = require('../../services/InputDataService.js');
const { aForEach } = require('../../utils/async-array-utils.js');
const { getFilePathRelativeFromRoot } = require('../../utils/get-file-path-relative-from-root.js');
/**
* @desc Gets a cached result from ReportService. Since ReportService slightly modifies analyzer
* output, we 'unwind' before we return...
* @param {object} config
* @param {string} config.analyzerName
* @param {string} config.identifier
*/
function getCachedAnalyzerResult({ analyzerName, identifier }) {
const cachedResult = ReportService.getCachedResult({ analyzerName, identifier });
if (!cachedResult) {
return;
}
LogService.success(`cached version found for ${identifier}`);
const { queryOutput } = cachedResult;
const { analyzerMeta } = cachedResult.meta;
analyzerMeta.__fromCache = true;
return { analyzerMeta, queryOutput }; // eslint-disable-line consistent-return
}
/**
* @desc analyzes one entry: the callback can traverse a given ast for each entry
* @param {AstDataProject} projectData
* @param {function} astAnalysis
*/
async function analyzePerAstEntry(projectData, astAnalysis) {
const entries = [];
await aForEach(projectData.entries, async ({ file, ast, context: astContext }) => {
const relativePath = getFilePathRelativeFromRoot(file, projectData.project.path);
const context = { code: astContext.code, relativePath, projectData };
LogService.debug(`${pathLib.resolve(projectData.project.path, file)}`);
const { result, meta } = await astAnalysis(ast, context);
entries.push({ file: relativePath, meta, result });
});
const filteredEntries = entries.filter(({ result }) => Boolean(result.length));
return filteredEntries;
}
/**
* @desc This method ensures that the result returned by an analyzer always has a consistent format.
* By returning the configuration for the queryOutput, it will be possible to run later queries
* under the same circumstances
* @param {array} queryOutput
* @param {object} configuration
* @param {object} analyzer
*/
function ensureAnalyzerResultFormat(queryOutput, configuration, analyzer) {
const { targetProjectMeta, identifier, referenceProjectMeta } = analyzer;
const optional = {};
if (targetProjectMeta) {
optional.targetProject = targetProjectMeta;
delete optional.targetProject.path; // get rid of machine specific info
}
if (referenceProjectMeta) {
optional.referenceProject = referenceProjectMeta;
delete optional.referenceProject.path; // get rid of machine specific info
}
/** @type {AnalyzerResult} */
const aResult = {
queryOutput,
analyzerMeta: {
name: analyzer.name,
requiredAst: analyzer.requiredAst,
identifier,
...optional,
configuration,
},
};
// For now, delete data relatable to local machine + path data that will recognize
// projX#v1 (via rootA/projX#v1, rootB/projX#v2) as identical entities.
// Cleaning up local data paths will make sure their hashes will be similar
// across different machines
delete aResult.analyzerMeta.configuration.referenceProjectPath;
delete aResult.analyzerMeta.configuration.targetProjectPath;
if (Array.isArray(aResult.queryOutput)) {
aResult.queryOutput.forEach(projectOutput => {
if (projectOutput.project) {
delete projectOutput.project.path;
}
});
}
return aResult;
}
/**
* @desc Before running the analyzer, we need two conditions for a 'compatible match':
* 1. referenceProject is imported by targetProject at all
* 2. referenceProject and targetProject have compatible major versions
* @param {string} referencePath
* @param {string} targetPath
*/
function checkForMatchCompatibility(referencePath, targetPath) {
const refFile = pathLib.resolve(referencePath, 'package.json');
const referencePkg = JSON.parse(fs.readFileSync(refFile, 'utf8'));
const targetFile = pathLib.resolve(targetPath, 'package.json');
const targetPkg = JSON.parse(fs.readFileSync(targetFile, 'utf8'));
const allTargetDeps = [
...Object.entries(targetPkg.devDependencies || {}),
...Object.entries(targetPkg.dependencies || {}),
];
const importEntry = allTargetDeps.find(([name]) => referencePkg.name === name);
if (!importEntry) {
return { compatible: false, reason: 'no-dependency' };
}
if (!semver.satisfies(referencePkg.version, importEntry[1])) {
return { compatible: false, reason: 'no-matched-version' };
}
return { compatible: true };
}
class Analyzer {
constructor() {
this.requiredAst = 'babel';
}
static get requiresReference() {
return false;
}
/**
* @param {AnalyzerConfig} cfg
* @returns {CachedAnalyzerResult|undefined}
*/
_prepare(cfg) {
this.targetProjectMeta = InputDataService.getProjectMeta(cfg.targetProjectPath, true);
if (cfg.referenceProjectPath) {
this.referenceProjectMeta = InputDataService.getProjectMeta(cfg.referenceProjectPath, true);
}
/**
* Create a unique hash based on target, reference and configuration
*/
this.identifier = ReportService.createIdentifier({
targetProject: this.targetProjectMeta,
referenceProject: this.referenceProjectMeta,
analyzerConfig: cfg,
});
if (cfg.referenceProjectPath) {
this.referenceProjectMeta = InputDataService.getProjectMeta(cfg.referenceProjectPath, true);
const { compatible, reason } = checkForMatchCompatibility(
cfg.referenceProjectPath,
cfg.targetProjectPath,
);
if (!compatible) {
LogService.info(
`skipping ${LogService.pad(this.name, 16)} for ${
this.identifier
}: (${reason})\n${cfg.targetProjectPath.replace(
'/Users/hu84jr/git/providence/providence-input-data/search-targets/',
'',
)}`,
);
return ensureAnalyzerResultFormat(`[${reason}]`, cfg, this);
}
}
/**
* See if we already have a cached result on the file system.
*/
const cachedResult = getCachedAnalyzerResult({
analyzerName: this.name,
identifier: this.identifier,
});
if (cachedResult) {
return cachedResult;
}
LogService.info(`starting ${LogService.pad(this.name, 16)} for ${this.identifier}`);
/**
* Get reference and search-target data
*/
this.targetData = InputDataService.createDataObject(
[cfg.targetProjectPath],
cfg.gatherFilesConfig,
);
if (cfg.referenceProjectPath) {
this.referenceData = InputDataService.createDataObject(
[cfg.referenceProjectPath],
cfg.gatherFilesConfigReference || cfg.gatherFilesConfig,
);
}
return undefined;
}
/**
* @param {QueryOutput} queryOutput
* @param {AnalyzerConfig} cfg
* @returns {AnalyzerResult}
*/
_finalize(queryOutput, cfg) {
const analyzerResult = ensureAnalyzerResultFormat(queryOutput, cfg, this);
LogService.success(`finished ${LogService.pad(this.name, 16)} for ${this.identifier}`);
return analyzerResult;
}
/**
* @param {function} traverseEntry
*/
async _traverse(traverseEntry) {
/**
* Create ASTs for our inputData
*/
const astDataProjects = await QueryService.addAstToProjectsData(this.targetData, 'babel');
return analyzePerAstEntry(astDataProjects[0], traverseEntry);
}
async execute(customConfig = {}) {
const cfg = {
targetProjectPath: null,
referenceProjectPath: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const queryOutput = await this._traverse(() => {});
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = { Analyzer };
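// Hedged examples of checkForMatchCompatibility outcomes (package names and versions hypothetical):
//   reference '@lion/input' at 0.12.0 while the target depends on "@lion/input": "^0.11.0"
//     -> { compatible: false, reason: 'no-matched-version' }
//   target has no dependency on '@lion/input' at all
//     -> { compatible: false, reason: 'no-dependency' }
//   otherwise -> { compatible: true }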

View file

@@ -0,0 +1,75 @@
const fs = require('fs');
const pathLib = require('path');
const { isRelativeSourcePath } = require('../../utils/relative-source-path.js');
const { LogService } = require('../../services/LogService.js');
/**
* TODO: Use utils/resolve-import-path for 100% accuracy
*
* - from: 'reference-project/foo.js'
* - to: './foo.js'
* When we need to resolve to the main entry:
* - from: 'reference-project'
* - to: './index.js' (or other file specified in package.json 'main')
* @param {object} config
* @param {string} config.requestedExternalSource
* @param {{name, mainEntry}} config.externalProjectMeta
* @param {string} config.externalRootPath
* @returns {string|null}
*/
function fromImportToExportPerspective({
requestedExternalSource,
externalProjectMeta,
externalRootPath,
}) {
if (isRelativeSourcePath(requestedExternalSource)) {
LogService.warn('[fromImportToExportPerspective] Please only provide external import paths');
return null;
}
const scopedProject = requestedExternalSource[0] === '@';
// 'external-project/src/file.js' -> ['external-project', 'src', file.js']
let splitSource = requestedExternalSource.split('/');
if (scopedProject) {
// '@external/project'
splitSource = [splitSource.slice(0, 2).join('/'), ...splitSource.slice(2)];
}
// ['external-project', 'src', 'file.js'] -> 'external-project'
const project = splitSource.slice(0, 1).join('/');
// ['external-project', 'src', 'file.js'] -> 'src/file.js'
const localPath = splitSource.slice(1).join('/');
if (externalProjectMeta.name !== project) {
return null;
}
if (localPath) {
// like '@open-wc/x/y.js'
// Now, we need to resolve to a file or path. Even though a path can contain '.',
// we still need to check if we're not dealing with a folder.
// - '@open-wc/x/y.js' -> '@open-wc/x/y.js' or... '@open-wc/x/y.js/index.js' ?
// - or 'lion-based-ui/test' -> 'lion-based-ui/test/index.js' or 'lion-based-ui/test' ?
const pathToCheck = pathLib.resolve(externalRootPath, `./${localPath}`);
if (fs.existsSync(pathToCheck)) {
const stat = fs.statSync(pathToCheck);
if (stat && stat.isFile()) {
return `./${localPath}`; // '/path/to/lion-based-ui/fol.der' is a file
}
return `./${localPath}/index.js`; // '/path/to/lion-based-ui/fol.der' is a folder
// eslint-disable-next-line no-else-return
} else if (fs.existsSync(`${pathToCheck}.js`)) {
return `./${localPath}.js`; // '/path/to/lion-based-ui/fol.der' is file '/path/to/lion-based-ui/fol.der.js'
}
} else {
// like '@lion/core'
let mainEntry = externalProjectMeta.mainEntry || 'index.js';
if (!mainEntry.startsWith('./')) {
mainEntry = `./${mainEntry}`;
}
return mainEntry;
}
return null;
}
module.exports = { fromImportToExportPerspective };
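// Hedged usage sketch (package name, paths and meta shape are hypothetical, following the params above):
//   fromImportToExportPerspective({
//     requestedExternalSource: '@lion/input/translations/nl',
//     externalProjectMeta: { name: '@lion/input', mainEntry: './index.js' },
//     externalRootPath: '/path/to/node_modules/@lion/input',
//   });
//   // -> './translations/nl' when that path is a file, './translations/nl/index.js' when it is a
//   //    folder, './translations/nl.js' when only that file exists, or null on a project mismatch.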

View file

@@ -0,0 +1,54 @@
/* eslint-disable no-param-reassign */
const pathLib = require('path');
const {
isRelativeSourcePath,
// toRelativeSourcePath,
} = require('../../utils/relative-source-path.js');
const { resolveImportPath } = require('../../utils/resolve-import-path.js');
const { aMap } = require('../../utils/async-array-utils.js');
function toLocalPath(currentDirPath, resolvedPath) {
let relativeSourcePath = pathLib.relative(currentDirPath, resolvedPath);
if (!relativeSourcePath.startsWith('.')) {
// correction on top of pathLib.relative, which returns local paths without a './' prefix,
// making them look (from an import perspective) like external modules.
// so 'my-local-files.js' -> './my-local-files.js'
relativeSourcePath = `./${relativeSourcePath}`;
}
return relativeSourcePath;
}
/**
* @desc Resolves and converts to normalized local/absolute path, based on file-system information.
* - from: { source: '../../relative/file' }
* - to: {
* fullPath: './absolute/path/from/root/to/relative/file.js',
* normalizedPath: '../../relative/file.js'
* }
* @param {FindImportsAnalysisResult} queryOutput
* @param {string} relativePath the file being currently processed
* @param {string} [rootPath=process.cwd()] root of the project the file belongs to
* @returns {Promise<FindImportsAnalysisResult>} entries enriched with a normalizedSource: a relative path from root (usually a project) or an external path like 'lion-based-ui/x.js'
*/
async function normalizeSourcePaths(queryOutput, relativePath, rootPath = process.cwd()) {
const currentFilePath = pathLib.resolve(rootPath, relativePath);
const currentDirPath = pathLib.dirname(currentFilePath);
return aMap(queryOutput, async specifierResObj => {
if (specifierResObj.source) {
if (isRelativeSourcePath(specifierResObj.source) && relativePath) {
// This will be a source like '../my/file.js' or './file.js'
const resolvedPath = await resolveImportPath(specifierResObj.source, currentFilePath);
specifierResObj.normalizedSource =
resolvedPath && toLocalPath(currentDirPath, resolvedPath);
// specifierResObj.fullSource = resolvedPath && toRelativeSourcePath(resolvedPath, rootPath);
} else {
// This will be a source from a project, like 'lion-based-ui/x.js' or '@open-wc/testing/y.js'
specifierResObj.normalizedSource = specifierResObj.source;
// specifierResObj.fullSource = specifierResObj.source;
}
}
return specifierResObj;
});
}
module.exports = { normalizeSourcePaths };
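// Hedged sketch (paths hypothetical): for './src/sub/file.js' containing
// `import { x } from '../helpers'`, the entry { source: '../helpers' } gets a
// normalizedSource of '../helpers.js' (or '../helpers/index.js', depending on what
// resolveImportPath finds), while an external source like '@lion/core' stays untouched.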

View file

@ -0,0 +1,268 @@
const fs = require('fs');
const { default: traverse } = require('@babel/traverse');
const {
isRelativeSourcePath,
toRelativeSourcePath,
} = require('../../utils/relative-source-path.js');
const { resolveImportPath } = require('../../utils/resolve-import-path.js');
const { AstService } = require('../../services/AstService.js');
const { LogService } = require('../../services/LogService.js');
const { memoizeAsync } = require('../../utils/memoize.js');
// TODO: memoize trackDownIdentifierFromScope (we can do so if tests are not mocked under same
// filesystem paths)
/** @typedef {import('./types').RootFile} RootFile */
/**
* Unlike with an import, no binding is created for MyClass by Babel(?)
* This means 'path.scope.getBinding('MyClass')' returns undefined
* and we have to find a different way to retrieve this value.
* @param {object} astPath Babel ast traversal path
* @param {string} identifierName the name that should be tracked (and that exists inside scope of astPath)
*/
function getBindingAndSourceReexports(astPath, identifierName) {
// Get to root node of file and look for exports like `export { identifierName } from 'src';`
let source;
let bindingType;
let bindingPath;
let curPath = astPath;
while (curPath.parentPath) {
curPath = curPath.parentPath;
}
const rootPath = curPath;
rootPath.traverse({
ExportSpecifier(path) {
// eslint-disable-next-line arrow-body-style
const found =
path.node.exported.name === identifierName || path.node.local.name === identifierName;
if (found) {
bindingPath = path;
bindingType = 'ExportSpecifier';
source = path.parentPath.node.source.value;
path.stop();
}
},
});
return [source, bindingType, bindingPath];
}
/**
* @desc returns source and importedIdentifierName: We might be an import that was locally renamed.
* Since we are traversing, we are interested in the imported name. Or in case of a re-export,
* the local name.
* @param {object} astPath Babel ast traversal path
* @param {string} identifierName the name that should be tracked (and that exists inside scope of astPath)
*/
function getImportSourceFromAst(astPath, identifierName) {
let source;
let importedIdentifierName;
const binding = astPath.scope.getBinding(identifierName);
let bindingType = binding && binding.path.type;
let bindingPath = binding && binding.path;
const matchingTypes = ['ImportSpecifier', 'ImportDefaultSpecifier', 'ExportSpecifier'];
if (binding && matchingTypes.includes(bindingType)) {
source = binding.path.parentPath.node.source.value;
} else {
// no binding
[source, bindingType, bindingPath] = getBindingAndSourceReexports(astPath, identifierName);
}
const shouldLookForDefaultExport = bindingType === 'ImportDefaultSpecifier';
if (shouldLookForDefaultExport) {
importedIdentifierName = '[default]';
} else if (source) {
const { node } = bindingPath;
importedIdentifierName = (node.imported && node.imported.name) || node.local.name;
}
return { source, importedIdentifierName };
}
let trackDownIdentifier;
/**
* @example
*```js
* // 1. Starting point
* // target-proj/my-comp-import.js
* import { MyComp as TargetComp } from 'ref-proj';
*
* // 2. Intermediate stop: a re-export
* // ref-proj/exportsIndex.js (package.json has main: './exportsIndex.js')
* export { RefComp as MyComp } from './src/RefComp.js';
*
* // 3. End point: our declaration
* // ref-proj/src/RefComp.js
* export class RefComp extends LitElement {...}
*```
*
* @param {string} source an importSpecifier source, like 'ref-proj' or '../file'
* @param {string} identifierName imported reference/Identifier name, like 'MyComp'
* @param {string} currentFilePath file path, like '/path/to/target-proj/my-comp-import.js'
* @param {string} rootPath dir path, like '/path/to/target-proj'
* @returns {object} file: path of file containing the binding (exported declaration),
* like '/path/to/ref-proj/src/RefComp.js'
*/
async function trackDownIdentifierFn(source, identifierName, currentFilePath, rootPath, depth = 0) {
let rootFilePath; // our result path
let rootSpecifier; // the name under which it was imported
if (!isRelativeSourcePath(source)) {
// So, it is an external ref like '@lion/core' or '@open-wc/scoped-elements/index.js'
// At this moment in time, we don't know if we have file system access to this particular
// project. Therefore, we limit ourselves to tracking down local references.
// In case this helper is used inside an analyzer like 'match-subclasses', the external
// (search-target) project can be accessed and paths can be resolved to local ones,
// just like in 'match-imports' analyzer.
/** @type {RootFile} */
const result = { file: source, specifier: identifierName };
return result;
}
/**
* @prop resolvedSourcePath
* @type {string}
* @example resolveImportPath('../file') // => '/path/to/target-proj/file.js'
*/
const resolvedSourcePath = await resolveImportPath(source, currentFilePath);
LogService.debug(`[trackDownIdentifier] ${resolvedSourcePath}`);
const code = fs.readFileSync(resolvedSourcePath, 'utf8');
const ast = AstService.getAst(code, 'babel', { filePath: resolvedSourcePath });
const shouldLookForDefaultExport = identifierName === '[default]';
let reexportMatch = null; // named specifier declaration
let pendingTrackDownPromise;
traverse(ast, {
ExportDefaultDeclaration(path) {
if (!shouldLookForDefaultExport) {
return;
}
let newSource;
if (path.node.declaration.type === 'Identifier') {
newSource = getImportSourceFromAst(path, path.node.declaration.name).source;
}
if (newSource) {
pendingTrackDownPromise = trackDownIdentifier(
newSource,
'[default]',
resolvedSourcePath,
rootPath,
depth + 1,
);
} else {
// We found our file!
rootSpecifier = identifierName;
rootFilePath = toRelativeSourcePath(resolvedSourcePath, rootPath);
}
path.stop();
},
ExportNamedDeclaration: {
enter(path) {
if (reexportMatch || shouldLookForDefaultExport) {
return;
}
// Are we dealing with a re-export ?
if (path.node.specifiers && path.node.specifiers.length) {
const exportMatch = path.node.specifiers.find(s => s.exported.name === identifierName);
if (exportMatch) {
const localName = exportMatch.local.name;
let newSource;
if (path.node.source) {
/**
* @example
* export { x } from 'y'
*/
newSource = path.node.source.value;
} else {
/**
* @example
* import { x } from 'y'
* export { x }
*/
newSource = getImportSourceFromAst(path, identifierName).source;
if (!newSource) {
/**
* @example
* const x = 12;
* export { x }
*/
return;
}
}
reexportMatch = true;
pendingTrackDownPromise = trackDownIdentifier(
newSource,
localName,
resolvedSourcePath,
rootPath,
depth + 1,
);
path.stop();
}
}
},
exit(path) {
if (!reexportMatch) {
// We didn't find a re-exported Identifier, that means the reference is declared
// in current file...
rootSpecifier = identifierName;
rootFilePath = toRelativeSourcePath(resolvedSourcePath, rootPath);
path.stop();
}
},
},
});
if (pendingTrackDownPromise) {
// We can't handle promises inside Babel traverse, so we do it here...
const resObj = await pendingTrackDownPromise;
rootFilePath = resObj.file;
rootSpecifier = resObj.specifier;
}
return /** @type {RootFile } */ { file: rootFilePath, specifier: rootSpecifier };
}
trackDownIdentifier = memoizeAsync(trackDownIdentifierFn);
/**
* @param {BabelPath} astPath
* @param {string} identifierNameInScope
* @param {string} fullCurrentFilePath
* @param {string} projectPath
*/
async function trackDownIdentifierFromScope(
astPath,
identifierNameInScope,
fullCurrentFilePath,
projectPath,
) {
const sourceObj = getImportSourceFromAst(astPath, identifierNameInScope);
/** @type {RootFile} */
let rootFile;
if (sourceObj.source) {
rootFile = await trackDownIdentifier(
sourceObj.source,
sourceObj.importedIdentifierName,
fullCurrentFilePath,
projectPath,
);
} else {
const specifier = sourceObj.importedIdentifierName || identifierNameInScope;
rootFile = { file: '[current]', specifier };
}
return rootFile;
}
module.exports = {
trackDownIdentifier,
getImportSourceFromAst,
trackDownIdentifierFromScope,
};
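// Hedged usage sketch, following the jsdoc example above (paths hypothetical):
//   await trackDownIdentifier('ref-proj', 'MyComp', '/path/to/target-proj/my-comp-import.js', '/path/to/target-proj');
//   // -> { file: 'ref-proj', specifier: 'MyComp' } (external source: tracking stops here)
//   await trackDownIdentifier('./exportsIndex.js', 'MyComp', '/path/to/ref-proj/some-file.js', '/path/to/ref-proj');
//   // -> { file: './src/RefComp.js', specifier: 'RefComp' } (the re-export was followed to its declaration)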

View file

@@ -0,0 +1,243 @@
/* eslint-disable no-shadow, no-param-reassign */
const FindImportsAnalyzer = require('./find-imports.js');
const FindExportsAnalyzer = require('./find-exports.js');
const { Analyzer } = require('./helpers/Analyzer.js');
const { fromImportToExportPerspective } = require('./helpers/from-import-to-export-perspective.js');
/**
* @desc Helper method for matchImportsPostprocess. Modifies its resultsObj
* @param {object} resultsObj
* @param {string} exportId like 'myExport::./reference-project/my/export.js::my-project'
* @param {Set<string>} filteredList
*/
function storeResult(resultsObj, exportId, filteredList, meta) {
if (!resultsObj[exportId]) {
// eslint-disable-next-line no-param-reassign
resultsObj[exportId] = { meta };
}
// eslint-disable-next-line no-param-reassign
resultsObj[exportId].files = [...(resultsObj[exportId].files || []), ...Array.from(filteredList)];
}
/**
* @param {FindExportsAnalyzerResult} exportsAnalyzerResult
* @param {FindImportsAnalyzerResult} importsAnalyzerResult
* @param {matchImportsConfig} customConfig
* @returns {AnalyzerResult}
*/
function matchImportsPostprocess(exportsAnalyzerResult, importsAnalyzerResult, customConfig) {
const cfg = {
...customConfig,
};
/**
* Step 1: a 'flat' data structure
* @desc Create a key value storage map for exports/imports matches
* - key: `${exportSpecifier}::${normalizedSource}::${project}` from reference project
* - value: an array of import file matches like `${targetProject}::${normalizedSource}`
* @example
* {
* 'myExport::./reference-project/my/export.js::my-project' : {
* meta: {...},
* files: [
* 'target-project-a::./import/file.js',
* 'target-project-b::./another/import/file.js'
* ],
* ]}
* }
*/
const resultsObj = {};
exportsAnalyzerResult.queryOutput.forEach(exportEntry => {
const exportsProjectObj = exportsAnalyzerResult.analyzerMeta.targetProject;
// Look for all specifiers that are exported, like [import {specifier} 'lion-based-ui/foo.js']
exportEntry.result.forEach(exportEntryResult => {
if (!exportEntryResult.exportSpecifiers) {
return;
}
exportEntryResult.exportSpecifiers.forEach(exportSpecifier => {
// Get all unique imports (name::source::project combinations) that match current exportSpecifier
const filteredImportsList = new Set();
const exportId = `${exportSpecifier}::${exportEntry.file}::${exportsProjectObj.name}`;
// eslint-disable-next-line no-shadow
// importsAnalyzerResult.queryOutput.forEach(({ entries, project }) => {
const importProject = importsAnalyzerResult.analyzerMeta.targetProject.name;
importsAnalyzerResult.queryOutput.forEach(({ result, file }) =>
result.forEach(importEntryResult => {
/**
* @example
* Example context (read by 'find-imports'/'find-exports' analyzers)
* - export (/folder/exporting-file.js):
* `export const x = 'foo'`
* - import (target-project-a/importing-file.js):
* `import { x, y } from '@reference-repo/folder/exporting-file.js'`
* Example variables (extracted by 'find-imports'/'find-exports' analyzers)
* - exportSpecifier: 'x'
* - importSpecifiers: ['x', 'y']
*/
const hasExportSpecifierImported =
// ['x', 'y'].includes('x')
importEntryResult.importSpecifiers.includes(exportSpecifier) ||
importEntryResult.importSpecifiers.includes('[*]');
/**
* @example
* exportFile './foo.js'
* => export const z = 'bar'
* importFile 'importing-target-project/file.js'
* => import { z } from '@reference/foo.js'
*/
const isFromSameSource =
exportEntry.file ===
fromImportToExportPerspective({
requestedExternalSource: importEntryResult.normalizedSource,
externalProjectMeta: exportsProjectObj,
externalRootPath: cfg.referenceProjectPath,
});
// TODO: transitive deps recognition. Could also be distinct post processor
// // export { z } from '../foo.js'
// // import { z } from '@reference/foo.js'
// (exportEntryResult.normalizedSource === importEntryResult.normalizedSource)
if (hasExportSpecifierImported && isFromSameSource) {
filteredImportsList.add(`${importProject}::${file}`);
}
}),
);
storeResult(resultsObj, exportId, filteredImportsList, exportEntry.meta);
});
});
});
/**
* Step 2: a rich data structure
* @desc Transform resultObj from step 1 into an array of objects
* @example
* [{
* exportSpecifier: {
* // name under which it is registered in npm ("name" attr in package.json)
* name: 'RefClass',
* project: 'exporting-ref-project',
* filePath: './ref-src/core.js',
* id: 'RefClass::ref-src/core.js::exporting-ref-project',
* meta: {...},
*
* // most likely via post processor
* },
* // All the matched targets (files importing the specifier), ordered per project
* matchesPerProject: [
* {
* project: 'importing-target-project',
* files: [
* './target-src/indirect-imports.js',
* ...
* ],
* },
* ...
* ],
* }]
*/
const resultsArray = Object.entries(resultsObj)
.map(([id, flatResult]) => {
const [exportSpecifierName, filePath, project] = id.split('::');
const { meta } = flatResult;
const exportSpecifier = {
name: exportSpecifierName,
project,
filePath,
id,
...(meta || {}),
};
const matchesPerProject = [];
flatResult.files.forEach(projectFile => {
// eslint-disable-next-line no-shadow
const [project, file] = projectFile.split('::');
let projectEntry = matchesPerProject.find(m => m.project === project);
if (!projectEntry) {
matchesPerProject.push({ project, files: [] });
projectEntry = matchesPerProject[matchesPerProject.length - 1];
}
projectEntry.files.push(file);
});
return {
exportSpecifier,
matchesPerProject,
};
})
.filter(r => Object.keys(r.matchesPerProject).length);
return /** @type {AnalyzerResult} */ resultsArray;
}
class MatchImportsAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'match-imports';
}
static get requiresReference() {
return true;
}
/**
* @desc Based on ExportsAnalyzerResult of reference project(s) (for instance lion-based-ui)
* and ImportsAnalyzerResult of search-targets (for instance my-app-using-lion-based-ui),
* an overview is returned of all matching imports and exports.
* @param {MatchImportsConfig} customConfig
*/
async execute(customConfig = {}) {
/**
* @typedef MatchImportsConfig
* @property {FindExportsConfig} [exportsConfig] These will be used when no exportsAnalyzerResult
* is provided (recommended way)
* @property {FindImportsConfig} [importsConfig]
* @property {GatherFilesConfig} [gatherFilesConfig]
* @property {array} [referenceProjectPath] reference paths
* @property {array} [targetProjectPath] search target paths
*/
const cfg = {
gatherFilesConfig: {},
referenceProjectPath: null,
targetProjectPath: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const findExportsAnalyzer = new FindExportsAnalyzer();
const exportsAnalyzerResult = await findExportsAnalyzer.execute({
metaConfig: cfg.metaConfig,
targetProjectPath: cfg.referenceProjectPath,
});
const findImportsAnalyzer = new FindImportsAnalyzer();
const importsAnalyzerResult = await findImportsAnalyzer.execute({
metaConfig: cfg.metaConfig,
targetProjectPath: cfg.targetProjectPath,
});
const queryOutput = matchImportsPostprocess(exportsAnalyzerResult, importsAnalyzerResult, cfg);
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = MatchImportsAnalyzer;
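// Hedged usage sketch (paths hypothetical):
//   const analyzer = new MatchImportsAnalyzer();
//   const result = await analyzer.execute({
//     referenceProjectPath: '/path/to/node_modules/@lion/form-core',
//     targetProjectPath: '/path/to/my-app',
//   });
//   // result.queryOutput -> [{ exportSpecifier: { name, project, filePath, id }, matchesPerProject: [...] }]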

View file

@@ -0,0 +1,504 @@
/* eslint-disable no-shadow, no-param-reassign */
const MatchSubclassesAnalyzer = require('./match-subclasses.js');
const FindExportsAnalyzer = require('./find-exports.js');
const FindCustomelementsAnalyzer = require('./find-customelements.js');
const { Analyzer } = require('./helpers/Analyzer.js');
/** @typedef {import('./types').FindExportsAnalyzerResult} FindExportsAnalyzerResult */
/** @typedef {import('./types').FindCustomelementsAnalyzerResult} FindCustomelementsAnalyzerResult */
/** @typedef {import('./types').MatchSubclassesAnalyzerResult} MatchSubclassesAnalyzerResult */
/** @typedef {import('./types').FindImportsAnalyzerResult} FindImportsAnalyzerResult */
/** @typedef {import('./types').MatchedExportSpecifier} MatchedExportSpecifier */
/** @typedef {import('./types').RootFile} RootFile */
/**
* For prefix `{ from: 'lion', to: 'wolf' }`
*
* Keeps
* - WolfCheckbox (extended from LionCheckbox)
* - wolf-checkbox (extended from lion-checkbox)
*
* Removes
* - SheepCheckbox (extended from LionCheckbox)
* - WolfTickButton (extended from LionCheckbox)
* - sheep-checkbox (extended from lion-checkbox)
* - etc...
* @param {MatchPathsAnalyzerOutputFile[]} queryOutput
* @param {{from:string, to:string}} prefix
*/
function filterPrefixMatches(queryOutput, prefix) {
const capitalize = prefix => `${prefix[0].toUpperCase()}${prefix.slice(1)}`;
const filteredQueryOutput = queryOutput
.map(e => {
let keepVariable = false;
let keepTag = false;
if (e.variable) {
const fromUnprefixed = e.variable.from.replace(capitalize(prefix.from), '');
const toUnprefixed = e.variable.to.replace(capitalize(prefix.to), '');
keepVariable = fromUnprefixed === toUnprefixed;
}
if (e.tag) {
const fromUnprefixed = e.tag.from.replace(prefix.from, '');
const toUnprefixed = e.tag.to.replace(prefix.to, '');
keepTag = fromUnprefixed === toUnprefixed;
}
return {
name: e.name,
...(keepVariable ? { variable: e.variable } : {}),
...(keepTag ? { tag: e.tag } : {}),
};
})
.filter(e => e.tag || e.variable);
return filteredQueryOutput;
}
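// Hedged example of the filtering above, for prefix { from: 'lion', to: 'wolf' } (names hypothetical):
//   { name: 'LionCheckbox',
//     variable: { from: 'LionCheckbox', to: 'WolfCheckbox' },
//     tag: { from: 'lion-checkbox', to: 'sheep-checkbox' } }
//   is kept, but only with its `variable` pair; the `tag` pair is dropped because the
//   unprefixed names ('-checkbox' vs 'sheep-checkbox') differ.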
/**
*
* @param {MatchedExportSpecifier} matchSubclassesExportSpecifier
* @param {FindExportsAnalyzerResult} refFindExportsResult
* @returns {RootFile|undefined}
*/
function getExportSpecifierRootFile(matchSubclassesExportSpecifier, refFindExportsResult) {
/* eslint-disable arrow-body-style */
/** @type {RootFile} */
let rootFile;
refFindExportsResult.queryOutput.some(exportEntry => {
return exportEntry.result.some(exportEntryResult => {
if (!exportEntryResult.exportSpecifiers) {
return false;
}
/** @type {RootFile} */
exportEntryResult.exportSpecifiers.some(exportSpecifierString => {
const { name, filePath } = matchSubclassesExportSpecifier;
if (name === exportSpecifierString && filePath === exportEntry.file) {
const entry = exportEntryResult.rootFileMap.find(
rfEntry => rfEntry.currentFileSpecifier === name,
);
if (entry) {
rootFile = entry.rootFile;
if (rootFile.file === '[current]') {
rootFile.file = filePath;
}
}
}
return false;
});
return Boolean(rootFile);
});
});
return rootFile;
/* eslint-enable arrow-body-style */
}
function getClosestToRootTargetPath(targetPaths, targetExportsResult) {
let targetPath;
const { mainEntry } = targetExportsResult.analyzerMeta.targetProject;
if (targetPaths.includes(mainEntry)) {
targetPath = mainEntry;
} else {
// sort targetPaths: paths closest to root 'win'
[targetPath] = targetPaths.sort((a, b) => a.split('/').length - b.split('/').length);
}
return targetPath;
}
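// For example (hypothetical values): with targetPaths ['./src/WolfCheckbox.js', './index.js']
// and a mainEntry of './index.js', the mainEntry wins. Without a mainEntry match,
// './index.js' would still win, since it has the fewest path segments.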
/**
*
* @param {FindExportsAnalyzerResult} targetExportsResult
* @param {FindExportsAnalyzerResult} refFindExportsResult
* @param {string} targetMatchedFile file where `toClass` from match-subclasses is defined
* @param {string} fromClass Identifier exported by reference project, for instance LionCheckbox
* @param {string} toClass Identifier exported by target project, for instance WolfCheckbox
* @param {string} refProjectName for instance @lion/checkbox
*/
function getVariablePaths(
targetExportsResult,
refFindExportsResult,
targetMatchedFile,
fromClass,
toClass,
refProjectName,
) {
/* eslint-disable arrow-body-style */
/**
* finds all paths that export WolfCheckbox
* @example ['./src/WolfCheckbox.js', './index.js']
* @type {string[]}
*/
const targetPaths = [];
targetExportsResult.queryOutput.forEach(({ file: targetExportsFile, result }) => {
// Find the FindExportAnalyzerEntry with the same rootFile as the rootPath of match-subclasses
// (targetMatchedFile)
const targetPathMatch = result.find(targetExportsEntry => {
return targetExportsEntry.rootFileMap.find(rootFileMapEntry => {
if (!rootFileMapEntry) {
return false;
}
const { rootFile } = rootFileMapEntry;
if (rootFile.specifier !== toClass) {
return false;
}
if (rootFile.file === '[current]') {
return targetExportsFile === targetMatchedFile;
}
return rootFile.file === targetMatchedFile;
});
});
if (targetPathMatch) {
targetPaths.push(targetExportsFile);
}
});
if (!targetPaths.length) {
return undefined; // there would be nothing to replace
}
const targetPath = getClosestToRootTargetPath(targetPaths, targetExportsResult);
// [A3]
/**
   * finds all paths that export LionCheckbox
* @example ['./packages/checkbox/src/LionCheckbox.js', './index.js']
* @type {string[]}
*/
const refPaths = [];
refFindExportsResult.queryOutput.forEach(({ file, result }) => {
const refPathMatch = result.find(entry => {
if (entry.exportSpecifiers.includes(fromClass)) {
return true;
}
// if we're dealing with `export {x as y}`...
if (entry.localMap && entry.localMap.find(({ exported }) => exported === fromClass)) {
return true;
}
return false;
});
if (refPathMatch) {
refPaths.push(file);
}
});
const paths = refPaths.map(refP => ({ from: refP, to: targetPath }));
// Add all paths with project prefix as well.
const projectPrefixedPaths = paths.map(({ from, to }) => {
return { from: `${refProjectName}/${from.slice(2)}`, to };
});
return [...paths, ...projectPrefixedPaths];
/* eslint-enable arrow-body-style */
}
/**
*
* @param {FindCustomelementsAnalyzerResult} targetFindCustomelementsResult
* @param {FindCustomelementsAnalyzerResult} refFindCustomelementsResult
* @param {FindExportsAnalyzerResult} refFindExportsResult
* @param {string} targetMatchedFile file where `toClass` from match-subclasses is defined
* @param {string} toClass Identifier exported by target project, for instance `WolfCheckbox`
* @param {MatchSubclassEntry} matchSubclassEntry
*/
function getTagPaths(
targetFindCustomelementsResult,
refFindCustomelementsResult,
refFindExportsResult,
targetMatchedFile,
toClass,
matchSubclassEntry,
) {
/* eslint-disable arrow-body-style */
let targetResult;
targetFindCustomelementsResult.queryOutput.some(({ file, result }) => {
const targetPathMatch = result.find(entry => {
const sameRoot = entry.rootFile.file === targetMatchedFile;
const sameIdentifier = entry.rootFile.specifier === toClass;
return sameRoot && sameIdentifier;
});
if (targetPathMatch) {
targetResult = { file, tagName: targetPathMatch.tagName };
return true;
}
return false;
});
let refResult;
refFindCustomelementsResult.queryOutput.some(({ file, result }) => {
const refPathMatch = result.find(entry => {
const matchSubclassSpecifierRootFile = getExportSpecifierRootFile(
matchSubclassEntry.exportSpecifier,
refFindExportsResult,
);
if (!matchSubclassSpecifierRootFile) {
return false;
}
const sameRoot = entry.rootFile.file === matchSubclassSpecifierRootFile.file;
const sameIdentifier = entry.rootFile.specifier === matchSubclassEntry.exportSpecifier.name;
return sameRoot && sameIdentifier;
});
if (refPathMatch) {
refResult = { file, tagName: refPathMatch.tagName };
return true;
}
return false;
});
return { targetResult, refResult };
/* eslint-enable arrow-body-style */
}
/**
* @param {MatchSubclassesAnalyzerResult} targetMatchSubclassesResult
* @param {FindExportsAnalyzerResult} targetExportsResult
* @param {FindCustomelementsAnalyzerResult} targetFindCustomelementsResult
* @param {FindCustomelementsAnalyzerResult} refFindCustomelementsResult
* @param {FindExportsAnalyzerResult} refFindExportsResult
* @returns {AnalyzerResult}
*/
function matchPathsPostprocess(
targetMatchSubclassesResult,
targetExportsResult,
targetFindCustomelementsResult,
refFindCustomelementsResult,
refFindExportsResult,
refProjectName,
) {
/** @type {AnalyzerResult} */
const resultsArray = [];
targetMatchSubclassesResult.queryOutput.forEach(matchSubclassEntry => {
const fromClass = matchSubclassEntry.exportSpecifier.name;
matchSubclassEntry.matchesPerProject.forEach(projectMatch => {
projectMatch.files.forEach(({ identifier: toClass, file: targetMatchedFile }) => {
const resultEntry = {
name: fromClass,
};
// [A] Get variable paths
const paths = getVariablePaths(
targetExportsResult,
refFindExportsResult,
targetMatchedFile,
fromClass,
toClass,
refProjectName,
);
if (paths && paths.length) {
resultEntry.variable = {
from: fromClass,
to: toClass,
paths,
};
}
// [B] Get tag paths
const { targetResult, refResult } = getTagPaths(
targetFindCustomelementsResult,
refFindCustomelementsResult,
refFindExportsResult,
targetMatchedFile,
toClass,
matchSubclassEntry,
);
if (refResult && targetResult) {
resultEntry.tag = {
from: refResult.tagName,
to: targetResult.tagName,
paths: [
{ from: refResult.file, to: targetResult.file },
{ from: `${refProjectName}/${refResult.file.slice(2)}`, to: targetResult.file },
],
};
}
if (resultEntry.variable || resultEntry.tag) {
resultsArray.push(resultEntry);
}
});
});
});
return resultsArray;
}
/**
* Designed to work in conjunction with npm package `extend-docs`.
* It will lookup all class exports from reference project A (and store their available paths) and
* matches them against all imports of project B that extend exported class (and store their
* available paths).
*
* @example
* [
* ...
* {
* name: 'LionCheckbox',
* variable: {
* from: 'LionCheckbox',
* to: 'WolfCheckbox',
* paths: [
* { from: './index.js', to: './index.js' },
* { from: './src/LionCheckbox.js', to: './index.js' },
* { from: '@lion/checkbox-group', to: './index.js' },
* { from: '@lion/checkbox-group/src/LionCheckbox.js', to: './index.js' },
* ],
* },
* tag: {
* from: 'lion-checkbox',
* to: 'wolf-checkbox',
* paths: [
* { from: './lion-checkbox.js', to: './wolf-checkbox.js' },
* { from: '@lion/checkbox-group/lion-checkbox.js', to: './wolf-checkbox.js' },
* ],
* }
* },
* ...
* ]
*/
class MatchPathsAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'match-paths';
}
static get requiresReference() {
return true;
}
/**
* @param {MatchClasspathsConfig} customConfig
*/
async execute(customConfig = {}) {
/**
* @typedef MatchClasspathsConfig
* @property {GatherFilesConfig} [gatherFilesConfig]
* @property {GatherFilesConfig} [gatherFilesConfigReference]
* @property {string} [referenceProjectPath] reference path
* @property {string} [targetProjectPath] search target path
* @property {{ from: string, to: string }} [prefix]
*/
const cfg = {
gatherFilesConfig: {},
gatherFilesConfigReference: {},
referenceProjectPath: null,
targetProjectPath: null,
prefix: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* ## Goal A: variable
* Automatically generate a mapping from lion docs import paths to extension layer
* import paths. To be served to extend-docs
*
* ## Traversal steps
*
* [A1] Find path variable.to 'WolfCheckbox'
* Run 'match-subclasses' for target project: we find the 'rootFilePath' of class definition,
* which will be matched against the rootFiles found in [A2]
* Result: './packages/wolf-checkbox/WolfCheckbox.js'
* [A2] Find root export path under which 'WolfCheckbox' is exported
* Run 'find-exports' on target: we find all paths like ['./index.js', './src/WolfCheckbox.js']
* Result: './index.js'
* [A3] Find all exports of LionCheckbox
* Run 'find-exports' for reference project
* Result: ['./src/LionCheckbox.js', './index.js']
* [A4] Match data and create a result object "variable"
*/
// [A1]
const targetMatchSubclassesAnalyzer = new MatchSubclassesAnalyzer();
/** @type {MatchSubclassesAnalyzerResult} */
const targetMatchSubclassesResult = await targetMatchSubclassesAnalyzer.execute({
targetProjectPath: cfg.targetProjectPath,
referenceProjectPath: cfg.referenceProjectPath,
gatherFilesConfig: cfg.gatherFilesConfigReference,
});
// [A2]
const targetFindExportsAnalyzer = new FindExportsAnalyzer();
/** @type {FindExportsAnalyzerResult} */
const targetExportsResult = await targetFindExportsAnalyzer.execute({
targetProjectPath: cfg.targetProjectPath,
});
// [A3]
const refFindExportsAnalyzer = new FindExportsAnalyzer();
/** @type {FindExportsAnalyzerResult} */
const refFindExportsResult = await refFindExportsAnalyzer.execute({
targetProjectPath: cfg.referenceProjectPath,
});
/**
* ## Goal B: tag
* Automatically generate a mapping from lion docs import paths to extension layer
* import paths. To be served to extend-docs
*
* [1] Find path variable.to 'WolfCheckbox'
* Run 'match-subclasses' for target project: we find the 'rootFilePath' of class definition,
* Result: './packages/wolf-checkbox/WolfCheckbox.js'
* [B1] Find export path of 'wolf-checkbox'
* Run 'find-customelements' on target project and match rootFile of [A1] with rootFile of
* constructor.
* Result: './wolf-checkbox.js'
* [B2] Find export path of 'lion-checkbox'
* Run 'find-customelements' and find-exports (for rootpath) on reference project and match
* rootFile of constructor with rootFiles of where LionCheckbox is defined.
* Result: './packages/checkbox/lion-checkbox.js',
* [B4] Match data and create a result object "tag"
*/
// [B1]
const targetFindCustomelementsAnalyzer = new FindCustomelementsAnalyzer();
/** @type {FindCustomelementsAnalyzerResult} */
const targetFindCustomelementsResult = await targetFindCustomelementsAnalyzer.execute({
targetProjectPath: cfg.targetProjectPath,
});
// [B2]
const refFindCustomelementsAnalyzer = new FindCustomelementsAnalyzer();
/** @type {FindCustomelementsAnalyzerResult} */
const refFindCustomelementsResult = await refFindCustomelementsAnalyzer.execute({
targetProjectPath: cfg.referenceProjectPath,
});
// refFindExportsAnalyzer was already created in A3
// Use one of the reference analyzer instances to get the project name
const refProjectName = refFindExportsAnalyzer.targetProjectMeta.name;
let queryOutput = matchPathsPostprocess(
targetMatchSubclassesResult,
targetExportsResult,
// refImportsResult,
targetFindCustomelementsResult,
refFindCustomelementsResult,
refFindExportsResult,
refProjectName,
);
if (cfg.prefix) {
queryOutput = filterPrefixMatches(queryOutput, cfg.prefix);
}
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = MatchPathsAnalyzer;
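// A minimal usage sketch (the file name, project paths and prefix below are hypothetical
// placeholders):
//
//   const MatchPathsAnalyzer = require('./match-paths.js');
//   const analyzer = new MatchPathsAnalyzer();
//   const result = await analyzer.execute({
//     referenceProjectPath: '/path/to/lion-based-ui',
//     targetProjectPath: '/path/to/wolf-ui',
//     prefix: { from: 'lion', to: 'wolf' },
//   });
//   // result.queryOutput follows the { name, variable, tag } shape documented above the class.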

View file

@ -0,0 +1,330 @@
/* eslint-disable no-shadow, no-param-reassign */
const FindClassesAnalyzer = require('./find-classes.js');
const FindExportsAnalyzer = require('./find-exports.js');
const { Analyzer } = require('./helpers/Analyzer.js');
const { fromImportToExportPerspective } = require('./helpers/from-import-to-export-perspective.js');
/** @typedef {import('./types').FindClassesAnalyzerResult} FindClassesAnalyzerResult */
/** @typedef {import('./types').FindExportsAnalyzerResult} FindExportsAnalyzerResult */
function getMemberOverrides(
refClassesAResult,
classMatch,
exportEntry,
exportEntryResult,
exportSpecifier,
) {
if (!classMatch.members) return;
const { rootFile } = exportEntryResult.rootFileMap.find(
m => m.currentFileSpecifier === exportSpecifier,
);
const classFile = rootFile.file === '[current]' ? exportEntry.file : rootFile.file;
// check which methods are overridden as well...?
const entry = refClassesAResult.queryOutput.find(classEntry => classEntry.file === classFile);
if (!entry) {
// TODO: we should look in an external project for our classFile definition
return;
}
const originalClass = entry.result.find(({ name }) => name === classMatch.rootFile.specifier);
const methods = classMatch.members.methods.filter(m =>
originalClass.members.methods.find(({ name }) => name === m.name),
);
  const props = classMatch.members.props.filter(p =>
    originalClass.members.props.find(({ name }) => name === p.name),
  );
// eslint-disable-next-line consistent-return
return { methods, props };
}
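// Illustrative (hypothetical) return shape of getMemberOverrides:
//   { methods: [{ name: '_onChange', accessType: 'protected' }], props: [...] }
// i.e. only the members of the subclass that also exist on the original reference class.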
/**
 * @desc Helper method for matchSubclassesPostprocess. Modifies its resultsObj
 * @param {object} resultsObj
 * @param {string} exportId like 'myExport::./reference-project/my/export.js::my-project'
 * @param {Set<{projectFileId: string, memberOverrides: object}>} filteredList
 * @param {object} meta
*/
function storeResult(resultsObj, exportId, filteredList, meta) {
if (!resultsObj[exportId]) {
// eslint-disable-next-line no-param-reassign
resultsObj[exportId] = { meta };
}
// eslint-disable-next-line no-param-reassign
resultsObj[exportId].files = [...(resultsObj[exportId].files || []), ...Array.from(filteredList)];
}
/**
* @param {FindExportsAnalyzerResult} exportsAnalyzerResult
* @param {FindClassesAnalyzerResult} targetClassesAnalyzerResult
* @param {FindClassesAnalyzerResult} refClassesAResult
* @param {MatchSubclassesConfig} customConfig
* @returns {AnalyzerResult}
*/
function matchSubclassesPostprocess(
exportsAnalyzerResult,
targetClassesAnalyzerResult,
refClassesAResult,
customConfig,
) {
const cfg = {
...customConfig,
};
/**
* Step 1: a 'flat' data structure
* @desc Create a key value storage map for exports/class matches
* - key: `${exportSpecifier}::${normalizedSource}::${project}` from reference project
* - value: an array of import file matches like `${targetProject}::${normalizedSource}::${className}`
* @example
* {
* 'LionDialog::./reference-project/my/export.js::my-project' : {
* meta: {...},
* files: [
* 'target-project-a::./import/file.js::MyDialog',
* 'target-project-b::./another/import/file.js::MyOtherDialog'
* ],
* ]}
* }
*/
const resultsObj = {};
exportsAnalyzerResult.queryOutput.forEach(exportEntry => {
const exportsProjectObj = exportsAnalyzerResult.analyzerMeta.targetProject;
const exportsProjectName = exportsProjectObj.name;
    // Look for all specifiers that are exported, consumable via `import { specifier } from 'lion-based-ui/foo.js'`
exportEntry.result.forEach(exportEntryResult => {
if (!exportEntryResult.exportSpecifiers) {
return;
}
exportEntryResult.exportSpecifiers.forEach(exportSpecifier => {
// Get all unique imports (name::source::project combinations) that match current
// exportSpecifier
const filteredImportsList = new Set();
const exportId = `${exportSpecifier}::${exportEntry.file}::${exportsProjectName}`;
// eslint-disable-next-line no-shadow
const importProject = targetClassesAnalyzerResult.analyzerMeta.targetProject.name;
targetClassesAnalyzerResult.queryOutput.forEach(({ result, file }) =>
result.forEach(classEntryResult => {
/**
* @example
* Example context (read by 'find-classes'/'find-exports' analyzers)
* - export (/folder/exporting-file.js):
* `export class X {}`
* - import (target-project-a/importing-file.js):
* `import { X } from '@reference-repo/folder/exporting-file.js'
*
* class Z extends Mixin(X) {}
* `
* Example variables (extracted by 'find-classes'/'find-exports' analyzers)
* - exportSpecifier: 'X'
* - superClasses: [{ name: 'X', ...}, { name: 'Mixin', ...}]
*/
const classMatch =
// [{ name: 'X', ...}, ...].find(klass => klass.name === 'X')
classEntryResult.superClasses &&
classEntryResult.superClasses.find(
klass => klass.rootFile.specifier === exportSpecifier,
);
// console.log('classEntryResult', classEntryResult);
if (!classMatch) {
return;
}
// console.log(exportSpecifier, classEntryResult.superClasses && classEntryResult.superClasses.map(k => k.rootFile.specifier));
/**
* @example
* - project "reference-project"
* - exportFile './foo.js'
* `export const z = 'bar'`
* - project "target-project"
* - importFile './file.js'
* `import { z } from 'reference-project/foo.js'`
*/
const isFromSameSource =
exportEntry.file ===
fromImportToExportPerspective({
requestedExternalSource: classMatch.rootFile.file,
externalProjectMeta: exportsProjectObj,
externalRootPath: cfg.referenceProjectPath,
});
if (classMatch && isFromSameSource) {
const memberOverrides = getMemberOverrides(
refClassesAResult,
classMatch,
exportEntry,
exportEntryResult,
exportSpecifier,
);
filteredImportsList.add({
projectFileId: `${importProject}::${file}::${classEntryResult.name}`,
memberOverrides,
});
}
}),
);
storeResult(resultsObj, exportId, filteredImportsList, exportEntry.meta);
});
});
});
/**
* Step 2: a rich data structure
* @desc Transform resultObj from step 1 into an array of objects
* @example
* [{
* exportSpecifier: {
* // name under which it is registered in npm ("name" attr in package.json)
* name: 'RefClass',
* project: 'exporting-ref-project',
* filePath: './ref-src/core.js',
* id: 'RefClass::ref-src/core.js::exporting-ref-project',
* meta: {...},
*
* // most likely via post processor
* },
* // All the matched targets (files importing the specifier), ordered per project
* matchesPerProject: [
* {
* project: 'importing-target-project',
* files: [
* { file:'./target-src/indirect-imports.js', className: 'X'},
* ...
* ],
* },
* ...
* ],
* }]
*/
const resultsArray = Object.entries(resultsObj)
.map(([id, flatResult]) => {
const [exportSpecifierName, filePath, project] = id.split('::');
const { meta } = flatResult;
const exportSpecifier = {
name: exportSpecifierName,
project,
filePath,
id,
...(meta || {}),
};
      // Although we only handle 1 target project here, the matchesPerProject structure
      // (which assumes multiple target projects) allows for easy aggregation of data
      // in the dashboard.
const matchesPerProject = [];
flatResult.files.forEach(({ projectFileId, memberOverrides }) => {
// eslint-disable-next-line no-shadow
const [project, file, identifier] = projectFileId.split('::');
let projectEntry = matchesPerProject.find(m => m.project === project);
if (!projectEntry) {
matchesPerProject.push({ project, files: [] });
projectEntry = matchesPerProject[matchesPerProject.length - 1];
}
projectEntry.files.push({ file, identifier, memberOverrides });
});
return {
exportSpecifier,
matchesPerProject,
};
})
.filter(r => Object.keys(r.matchesPerProject).length);
return /** @type {AnalyzerResult} */ resultsArray;
}
// function postProcessAnalyzerResult(aResult) {
// // Don't bloat the analyzerResult with the outputs (just the configurations) of other analyzers
// // delete aResult.analyzerMeta.configuration.targetClassesAnalyzerResult.queryOutput;
// // delete aResult.analyzerMeta.configuration.exportsAnalyzerResult.queryOutput;
// return aResult;
// }
class MatchSubclassesAnalyzer extends Analyzer {
constructor() {
super();
this.name = 'match-subclasses';
}
static get requiresReference() {
return true;
}
/**
* @desc Based on ExportsAnalyzerResult of reference project(s) (for instance lion-based-ui)
* and targetClassesAnalyzerResult of search-targets (for instance my-app-using-lion-based-ui),
* an overview is returned of all matching imports and exports.
* @param {MatchSubclassesConfig} customConfig
*/
async execute(customConfig = {}) {
/**
* @typedef MatchSubclassesConfig
* @property {FindExportsConfig} [exportsConfig] These will be used when no exportsAnalyzerResult
* is provided (recommended way)
* @property {FindClassesConfig} [findClassesConfig]
* @property {GatherFilesConfig} [gatherFilesConfig]
* @property {GatherFilesConfig} [gatherFilesConfigReference]
     * @property {string} [referenceProjectPath] reference project path
     * @property {string} [targetProjectPath] search target project path
*/
const cfg = {
gatherFilesConfig: {},
gatherFilesConfigReference: {},
referenceProjectPath: null,
targetProjectPath: null,
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const findExportsAnalyzer = new FindExportsAnalyzer();
/** @type {FindExportsAnalyzerResult} */
const exportsAnalyzerResult = await findExportsAnalyzer.execute({
targetProjectPath: cfg.referenceProjectPath,
gatherFilesConfig: cfg.gatherFilesConfigReference,
});
const findClassesAnalyzer = new FindClassesAnalyzer();
/** @type {FindClassesAnalyzerResult} */
const targetClassesAnalyzerResult = await findClassesAnalyzer.execute({
targetProjectPath: cfg.targetProjectPath,
});
const findRefClassesAnalyzer = new FindClassesAnalyzer();
/** @type {FindClassesAnalyzerResult} */
const refClassesAnalyzerResult = await findRefClassesAnalyzer.execute({
targetProjectPath: cfg.referenceProjectPath,
gatherFilesConfig: cfg.gatherFilesConfigReference,
});
const queryOutput = matchSubclassesPostprocess(
exportsAnalyzerResult,
targetClassesAnalyzerResult,
refClassesAnalyzerResult,
cfg,
);
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = MatchSubclassesAnalyzer;
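// A minimal usage sketch (the project paths below are hypothetical placeholders):
//
//   const MatchSubclassesAnalyzer = require('./match-subclasses.js');
//   const analyzer = new MatchSubclassesAnalyzer();
//   const result = await analyzer.execute({
//     referenceProjectPath: '/path/to/reference-project',
//     targetProjectPath: '/path/to/target-project',
//   });
//   // result.queryOutput entries follow the { exportSpecifier, matchesPerProject } shape
//   // built in matchSubclassesPostprocess above.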

View file

@ -0,0 +1,87 @@
const pathLib = require('path');
const { LogService } = require('../../services/LogService.js');
const /** @type {AnalyzerOptions} */ options = {
filterSpecifier(results, targetSpecifier, specifiersKey) {
return results.filter(entry => entry[specifiersKey] === targetSpecifier);
},
};
/**
*
* @param {AnalyzerResult} analyzerResult
* @param {FindImportsConfig} customConfig
* @returns {AnalyzerResult}
*/
function sortBySpecifier(analyzerResult, customConfig) {
const cfg = {
filterSpecifier: '',
specifiersKey: 'importSpecifiers', // override to make compatible with exportSpecifiers
...customConfig,
};
if (customConfig && customConfig.keepOriginalSourcePaths) {
LogService.error(
'[ post-processor "sort-by-specifier" ]: Please provide a QueryResult without "keepOriginalSourcePaths" configured',
);
process.exit(1);
}
const resultsObj = {};
analyzerResult.forEach(({ entries, project }) => {
const projectName = project.name;
entries.forEach(entry => {
entry.result.forEach(resultForEntry => {
const { normalizedSource, source } = resultForEntry;
const specifiers = resultForEntry[cfg.specifiersKey];
specifiers.forEach(s => {
const uniqueKey = `${s}::${normalizedSource || source}`;
const filePath = pathLib.resolve('/', projectName, entry.file).replace(/^\//, '');
if (resultsObj[uniqueKey]) {
resultsObj[uniqueKey] = [...resultsObj[uniqueKey], filePath];
} else {
resultsObj[uniqueKey] = [filePath];
}
});
});
});
});
/**
* Now transform into this format:
* "specifier": "LitElement",
* "source": "lion-based-ui/core.js",
* "id": "LitElement::lion-based-ui/core.js",
* "dependents": [
* "my-app/src/content-template.js",
* "my-app/src/table/data-table.js",
*/
let resultsBySpecifier = Object.entries(resultsObj).map(([id, dependents]) => {
const [specifier, source] = id.split('::');
return {
specifier,
source,
id,
dependents,
};
});
if (cfg.filterSpecifier) {
resultsBySpecifier = options.filterSpecifier(
resultsBySpecifier,
cfg.filterSpecifier,
cfg.specifiersKey,
);
}
return /** @type {AnalyzerResult} */ resultsBySpecifier;
}
module.exports = {
name: 'sort-by-specifier',
execute: sortBySpecifier,
compatibleAnalyzers: ['find-imports', 'find-exports'],
// This means it transforms the result output of an analyzer, and multiple
// post processors cannot be chained after this one
modifiesOutputStructure: true,
};
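// A minimal usage sketch (the file name and input below are hypothetical; the input is assumed
// to be in the [{ project, entries }, ...] shape consumed above):
//
//   const sortBySpecifier = require('./sort-by-specifier.js');
//   const output = sortBySpecifier.execute(analyzerResult, { filterSpecifier: 'LitElement' });
//   // => [{ specifier: 'LitElement', source: 'lion-based-ui/core.js',
//   //       id: 'LitElement::lion-based-ui/core.js', dependents: [...] }]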

View file

@ -0,0 +1,309 @@
export interface RootFile {
/** the file path containing declaration, for instance './target-src/direct-imports.js'. Can also contain keyword '[current]' */
file: string;
/** the specifier/identifier that was exported in root file, for instance 'MyClass' */
specifier: string;
}
export interface AnalyzerResult {
/** meta info object */
meta: Meta;
/** array of AST traversal output, per project file */
queryOutput: AnalyzerOutputFile[];
}
export interface AnalyzerOutputFile {
/** path relative from project root for which a result is generated based on AST traversal */
file: string;
/** result of AST traversal for file in project */
  result: any[];
}
// TODO: make sure that data structures of JSON output (generated in ReportService)
// and data structure generated in Analyzer.prototype._finalize match exactly (move logic from ReportSerivce to _finalize)
// so that these type definitions can be used to generate a json schema: https://www.npmjs.com/package/typescript-json-schema
interface Meta {
  /** type of the query. Currently only "ast-analyzer" supported */
searchType: string;
/** analyzer meta object */
analyzerMeta: AnalyzerMeta;
}
export interface AnalyzerMeta {
  /** analyzer name like 'find-imports' or 'match-subclasses' */
name: string;
/** the ast format. Currently only 'babel' */
requiredAst: string;
/** a unique hash based on target, reference and configuration */
identifier: string;
/** target project meta object */
targetProject: Project;
/** reference project meta object */
referenceProject?: Project;
/** the configuration used for this particular analyzer run */
configuration: object;
}
export interface Project {
/** "name" found in package.json and under which the package is registered in npm */
name: string;
/** "version" found in package.json */
version: string;
/** "main" File found in package.json */
mainFile: string;
  /** if a git repo is analyzed, stores commit hash, '[not-a-git-root]' if not */
commitHash: string;
}
// match-customelements
export interface MatchSubclassesAnalyzerResult extends AnalyzerResult {
queryOutput: MatchSubclassesAnalyzerOutputEntry[];
}
export interface MatchSubclassesAnalyzerOutputEntry {
exportSpecifier: MatchedExportSpecifier;
matchesPerProject: MatchSubclassesAnalyzerOutputEntryMatch[];
}
export interface MatchSubclassesAnalyzerOutputEntryMatch {
/** The target project that extends the class exported by reference project */
project: string;
/** Array of meta objects for matching files */
files: MatchSubclassesAnalyzerOutputEntryMatchFile[];
}
export interface MatchSubclassesAnalyzerOutputEntryMatchFile {
/**
* The local filepath that contains the matched class inside the target project
* like `./src/ExtendedClass.js`
*/
file: string;
/**
* The local Identifier inside matched file that is exported
* @example
   * - `ExtendedClass` for `export class ExtendedClass extends RefClass {};`
   * - `[default]` for `export default class ExtendedClass extends RefClass {};`
*/
identifier: string;
}
export interface MatchedExportSpecifier extends AnalyzerResult {
/** The exported Identifier name.
*
* For instance
* - `export { X as Y } from 'q'` => `Y`
* - `export default class Z {}` => `[default]`
*/
name: string;
/** Project name as found in package.json */
project: string;
/** Path relative from project root, for instance `./index.js` */
filePath: string;
/** "[default]::./index.js::exporting-ref-project" */
id: string;
}
// "find-customelements"
export interface FindCustomelementsAnalyzerResult extends AnalyzerResult {
queryOutput: FindCustomelementsAnalyzerOutputFile[];
}
export interface FindCustomelementsAnalyzerOutputFile extends AnalyzerOutputFile {
/** path relative from project root for which a result is generated based on AST traversal */
file: string;
/** result of AST traversal for file in project */
result: FindCustomelementsAnalyzerEntry[];
}
export interface FindCustomelementsAnalyzerEntry {
/**
* Tag name found in CE definition:
* `customElements.define('my-name', MyConstructor)` => 'my-name'
*/
tagName: string;
/**
* Identifier found in CE definition:
* `customElements.define('my-name', MyConstructor)` => MyConstructor
*/
constructorIdentifier: string;
  /** Rootfile traced for constructorIdentifier found in CE definition */
rootFile: RootFile;
}
// "find-exports"
export interface FindExportsAnalyzerResult extends AnalyzerResult {
queryOutput: FindExportsAnalyzerOutputFile[];
}
export interface FindExportsAnalyzerOutputFile extends AnalyzerOutputFile {
/** path relative from project root for which a result is generated based on AST traversal */
file: string;
/** result of AST traversal for file in project */
result: FindExportsAnalyzerEntry[];
}
export interface FindExportsAnalyzerEntry {
/**
* The specifiers found in an export statement.
*
* For example:
* - file `export class X {}` gives `['X']`
   * - file `export default class {}` gives `['[default]']`
* - file `export { y, z } from 'project'` gives `['y', 'z']`
*/
exportSpecifiers: string[];
/**
* The original "source" string belonging to specifier.
* For example:
* - file `export { x } from './my/file';` gives `"./my/file"`
* - file `export { x } from 'project';` gives `"project"`
*/
source: string;
/**
* The normalized "source" string belonging to specifier
* (based on file system information, resolves right names and extensions).
* For example:
* - file `export { x } from './my/file';` gives `"./my/file.js"`
* - file `export { x } from 'project';` gives `"project"` (only files in current project are resolved)
* - file `export { x } from '../';` gives `"../index.js"`
*/
normalizedSource: string;
/** map of tracked down Identifiers */
rootFileMap: RootFileMapEntry[];
}
export interface RootFileMapEntry {
/** This is the local name in the file we track from */
currentFileSpecifier: string;
/**
* The file that contains the original declaration of a certain Identifier/Specifier.
* Contains file(filePath) and specifier keys
*/
rootFile: RootFile;
}
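// Illustrative (hypothetical) entry: for `export { MyButton } from './src/MyButton.js'` in a
// project's index.js, the rootFileMap could contain
//   { currentFileSpecifier: 'MyButton', rootFile: { file: './src/MyButton.js', specifier: 'MyButton' } }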
// "find-imports"
export interface FindImportsAnalyzerResult extends AnalyzerResult {
queryOutput: FindImportsAnalyzerOutputFile[];
}
export interface FindImportsAnalyzerOutputFile extends AnalyzerOutputFile {
/** path relative from project root for which a result is generated based on AST traversal */
file: string;
/** result of AST traversal for file in project */
result: FindImportsAnalyzerEntry[];
}
export interface FindImportsAnalyzerEntry {
/**
* The specifiers found in an import statement.
*
* For example:
* - file `import { X } from 'project'` gives `['X']`
* - file `import X from 'project'` gives `['[default]']`
* - file `import x, { y, z } from 'project'` gives `['[default]', 'y', 'z']`
*/
importSpecifiers: string[];
/**
* The original "source" string belonging to specifier.
* For example:
* - file `import { x } from './my/file';` gives `"./my/file"`
* - file `import { x } from 'project';` gives `"project"`
*/
source: string;
/**
* The normalized "source" string belonging to specifier
* (based on file system information, resolves right names and extensions).
* For example:
* - file `import { x } from './my/file';` gives `"./my/file.js"`
* - file `import { x } from 'project';` gives `"project"` (only files in current project are resolved)
* - file `import { x } from '../';` gives `"../index.js"`
*/
normalizedSource: string;
}
// "find-classes"
export interface FindClassesAnalyzerResult extends AnalyzerResult {
queryOutput: FindClassesAnalyzerOutputFile[];
}
export interface FindClassesAnalyzerOutputFile extends AnalyzerOutputFile {
/** path relative from project root for which a result is generated based on AST traversal */
file: string;
/** result of AST traversal for file in project */
result: FindClassesAnalyzerEntry[];
}
export interface FindClassesAnalyzerEntry {
/** the name of the class */
name: string;
/** whether the class is a mixin function */
isMixin: boolean;
/** super classes and mixins */
superClasses: SuperClass[];
members: ClassMember;
}
interface ClassMember {
  props: ClassProperty[];
  methods: ClassMethod[];
}
interface ClassProperty {
  /** class property name */
  name: string;
  /** 'public', 'protected' or 'private' */
  accessType: string;
  /** can be 'get', 'set' or both */
  kind: string[];
  /** whether property is static */
  static: boolean;
}
interface ClassMethod {
  /** class method name */
  name: string;
  /** 'public', 'protected' or 'private' */
  accessType: string;
}
export interface SuperClass {
/** the name of the super class */
name: string;
/** whether the superClass is a mixin function */
isMixin: boolean;
rootFile: RootFile;
}
export interface FindClassesConfig {
/** search target paths */
targetProjectPath: string;
}
export interface AnalyzerConfig {
/** search target project path */
targetProjectPath: string;
gatherFilesConfig: GatherFilesConfig;
}
export interface MatchAnalyzerConfig extends AnalyzerConfig {
/** reference project path, used to match reference against target */
referenceProjectPath: string;
}

View file

@ -0,0 +1,221 @@
const deepmerge = require('deepmerge');
const { ReportService } = require('./services/ReportService.js');
const { InputDataService } = require('./services/InputDataService.js');
const { LogService } = require('./services/LogService.js');
const { QueryService } = require('./services/QueryService.js');
const { aForEach } = require('./utils/async-array-utils.js');
// After handling a combo, we should know which project versions we have, since the analyzer
// internally called createDataObject (which provides us with the needed meta info).
function addToSearchTargetDepsFile({ queryResult, queryConfig, providenceConfig }) {
const currentSearchTarget = queryConfig.analyzerConfig.targetProjectPath;
// eslint-disable-next-line array-callback-return, consistent-return
providenceConfig.targetProjectRootPaths.some(rootRepo => {
const rootProjectMeta = InputDataService.getProjectMeta(rootRepo);
if (currentSearchTarget.startsWith(rootRepo)) {
const { name: depName, version: depVersion } = queryResult.meta.analyzerMeta.targetProject;
// TODO: get version of root project as well. For now, we're good with just the name
// const rootProj = pathLib.basename(rootRepo);
const depProj = `${depName}#${depVersion}`;
// Write to file... TODO: add to array first
ReportService.writeEntryToSearchTargetDepsFile(depProj, rootProjectMeta);
return true;
}
});
}
function report(queryResult, cfg) {
if (cfg.report && !queryResult.meta.analyzerMeta.__fromCache) {
const { identifier } = queryResult.meta.analyzerMeta;
ReportService.writeToJson(queryResult, identifier, cfg.outputPath);
}
}
/**
* @desc creates unique QueryConfig for analyzer turn
* @param {QueryConfig} queryConfig
* @param {string} targetProjectPath
* @param {string} referenceProjectPath
*/
function getSlicedQueryConfig(queryConfig, targetProjectPath, referenceProjectPath) {
return {
...queryConfig,
...{
analyzerConfig: {
...queryConfig.analyzerConfig,
...{
...(referenceProjectPath ? { referenceProjectPath } : {}),
targetProjectPath,
},
},
},
};
}
/**
* @desc definition "projectCombo": referenceProject#version + searchTargetProject#version
* @param {QueryConfig} slicedQConfig
 * @param {object} cfg
*/
async function handleAnalyzerForProjectCombo(slicedQConfig, cfg) {
const queryResult = await QueryService.astSearch(slicedQConfig, {
gatherFilesConfig: cfg.gatherFilesConfig,
gatherFilesConfigReference: cfg.gatherFilesConfigReference,
...slicedQConfig.analyzerConfig,
});
if (queryResult) {
report(queryResult, cfg);
}
return queryResult;
}
/**
* @desc Here, we will match all our reference projects (exports) against all our search targets
* (imports).
*
* This is an expensive operation. Therefore, we allow caching.
* For each project, we store 'commitHash' and 'version' meta data.
* For each combination of referenceProject#version and searchTargetProject#version we
* will create a json output file.
* For its filename, it will create a hash based on referenceProject#version +
* searchTargetProject#version + cfg of analyzer.
* Whenever the generated hash already exists in previously stored query results,
* we don't have to regenerate it.
*
* All the json outputs can be aggregated in our dashboard and visually presented in
* various ways.
*
* @param {QueryConfig} queryConfig
* @param {ProvidenceConfig} cfg
*/
async function handleAnalyzer(queryConfig, cfg) {
const queryResults = [];
const { referenceProjectPaths, targetProjectPaths } = cfg;
await aForEach(targetProjectPaths, async searchTargetProject => {
if (referenceProjectPaths) {
await aForEach(referenceProjectPaths, async ref => {
        // Create shallow cfg copy with just the current reference folder
const slicedQueryConfig = getSlicedQueryConfig(queryConfig, searchTargetProject, ref);
const queryResult = await handleAnalyzerForProjectCombo(slicedQueryConfig, cfg);
queryResults.push(queryResult);
if (cfg.targetProjectRootPaths) {
addToSearchTargetDepsFile({
queryResult,
queryConfig: slicedQueryConfig,
providenceConfig: cfg,
});
}
});
} else {
const slicedQueryConfig = getSlicedQueryConfig(queryConfig, searchTargetProject);
const queryResult = await handleAnalyzerForProjectCombo(slicedQueryConfig, cfg);
queryResults.push(queryResult);
if (cfg.targetProjectRootPaths) {
addToSearchTargetDepsFile({
queryResult,
queryConfig: slicedQueryConfig,
providenceConfig: cfg,
});
}
}
});
return queryResults;
}
async function handleFeature(queryConfig, cfg, inputData) {
if (cfg.queryMethod === 'grep') {
const queryResult = await QueryService.grepSearch(inputData, queryConfig, {
gatherFilesConfig: cfg.gatherFilesConfig,
gatherFilesConfigReference: cfg.gatherFilesConfigReference,
});
return queryResult;
}
return undefined;
}
async function handleRegexSearch(queryConfig, cfg, inputData) {
if (cfg.queryMethod === 'grep') {
const queryResult = await QueryService.grepSearch(inputData, queryConfig, {
gatherFilesConfig: cfg.gatherFilesConfig,
gatherFilesConfigReference: cfg.gatherFilesConfigReference,
});
return queryResult;
}
return undefined;
}
/**
* @desc Creates a report with usage metrics, based on a queryConfig.
*
* @param {QueryConfig} queryConfig a query configuration object containing analyzerOptions.
* @param {object} customConfig
* @param {'ast'|'grep'} customConfig.queryMethod whether analyzer should be run or a grep should
* be performed
* @param {string[]} customConfig.targetProjectPaths search target projects. For instance
* ['/path/to/app-a', '/path/to/app-b', ... '/path/to/app-z']
* @param {string[]} [customConfig.referenceProjectPaths] reference projects. Needed for 'match
* analyzers', having `requiresReference: true`. For instance
* ['/path/to/lib1', '/path/to/lib2']
* @param {GatherFilesConfig} [customConfig.gatherFilesConfig]
* @param {boolean} [customConfig.report]
* @param {boolean} [customConfig.debugEnabled]
*/
async function providenceMain(queryConfig, customConfig) {
const cfg = deepmerge(
{
queryMethod: 'grep',
// This is a merge of all 'main entry projects'
// found in search-targets, including their children
targetProjectPaths: null,
referenceProjectPaths: null,
// This will be needed to identify the parent/child relationship to write to
// {outputFolder}/entryProjectDependencies.json, which will map
// a project#version to [ depA#version, depB#version ]
targetProjectRootPaths: null,
gatherFilesConfig: {},
report: true,
debugEnabled: false,
writeLogFile: false,
},
customConfig,
);
if (cfg.debugEnabled) {
LogService.debugEnabled = true;
}
if (cfg.referenceProjectPaths) {
InputDataService.referenceProjectPaths = cfg.referenceProjectPaths;
}
let queryResults;
if (queryConfig.type === 'analyzer') {
queryResults = await handleAnalyzer(queryConfig, cfg);
} else {
const inputData = InputDataService.createDataObject(
cfg.targetProjectPaths,
cfg.gatherFilesConfig,
);
if (queryConfig.type === 'feature') {
queryResults = await handleFeature(queryConfig, cfg, inputData);
report(queryResults, cfg);
} else if (queryConfig.type === 'search') {
queryResults = await handleRegexSearch(queryConfig, cfg, inputData);
report(queryResults, cfg);
}
}
if (cfg.writeLogFile) {
LogService.writeLogFile();
}
return queryResults;
}
module.exports = {
providence: providenceMain,
};
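// A minimal usage sketch (the file name and project paths below are hypothetical placeholders):
//
//   const { providence } = require('./providence.js');
//   const { QueryService } = require('./services/QueryService.js');
//   const queryConfig = QueryService.getQueryConfigFromAnalyzer('match-imports');
//   const queryResults = await providence(queryConfig, {
//     queryMethod: 'ast',
//     targetProjectPaths: ['/path/to/target-project'],
//     referenceProjectPaths: ['/path/to/reference-project'],
//   });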

View file

@ -0,0 +1,126 @@
const {
createProgram,
getPreEmitDiagnostics,
ModuleKind,
ModuleResolutionKind,
ScriptTarget,
} = require('typescript');
const babelParser = require('@babel/parser');
const esModuleLexer = require('es-module-lexer');
const parse5 = require('parse5');
const traverseHtml = require('../utils/traverse-html.js');
const { LogService } = require('./LogService.js');
class AstService {
/**
* @deprecated for simplicity/maintainability, only allow Babel for js
* Compiles an array of file paths using Typescript.
* @param {string[]} filePaths
* @param options
*/
static _getTypescriptAst(filePaths, options) {
// eslint-disable-next-line no-param-reassign
filePaths = Array.isArray(filePaths) ? filePaths : [filePaths];
const defaultOptions = {
noEmitOnError: false,
allowJs: true,
experimentalDecorators: true,
target: ScriptTarget.Latest,
downlevelIteration: true,
module: ModuleKind.ESNext,
// module: ModuleKind.CommonJS,
// lib: ["esnext", "dom"],
strictNullChecks: true,
moduleResolution: ModuleResolutionKind.NodeJs,
esModuleInterop: true,
noEmit: true,
allowSyntheticDefaultImports: true,
allowUnreachableCode: true,
allowUnusedLabels: true,
skipLibCheck: true,
isolatedModules: true,
};
const program = createProgram(filePaths, options || defaultOptions);
const diagnostics = getPreEmitDiagnostics(program);
const files = program.getSourceFiles().filter(sf => filePaths.includes(sf.fileName));
return { diagnostics, program, files };
}
/**
   * Parses a string of code using Babel.
   * @param {string} code
*/
static _getBabelAst(code) {
const ast = babelParser.parse(code, {
sourceType: 'module',
plugins: ['importMeta', 'dynamicImport', 'classProperties'],
});
return ast;
}
/**
   * @desc Extracts the contents of all script tags found in the html code.
* @param {string} htmlCode
*/
static getScriptsFromHtml(htmlCode) {
const ast = parse5.parseFragment(htmlCode);
const scripts = [];
traverseHtml(ast, {
script(path) {
const code = path.node.childNodes[0] ? path.node.childNodes[0].value : '';
scripts.push(code);
},
});
return scripts;
}
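  // For example (hypothetical input):
  //   AstService.getScriptsFromHtml('<script>let a;</script><p></p><script>let b;</script>')
  //   // => ['let a;', 'let b;']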
/**
* @deprecated for simplicity/maintainability, only allow Babel for js
* @param {string} code
*/
static async _getEsModuleLexerOutput(code) {
return esModuleLexer.parse(code);
}
/**
* @desc Returns the desired AST
* Why would we support multiple ASTs/parsers?
* - 'babel' is our default tool for analysis. It's the most versatile and popular tool, it's
* close to the EStree standard (other than Typescript) and a lot of plugins and resources can
* be found online. It also allows to parse Typescript and spec proposals.
* - 'typescript' (deprecated) is needed for some valuable third party tooling, like web-component-analyzer
* - 'es-module-lexer' (deprecated) is needed for the dedicated task of finding module imports; it is way
* quicker than a full fledged AST parser
   * @param {string} code
   * @param {'babel'|'typescript'|'es-module-lexer'} astType
* @param { object } [options]
* @param { string } [options.filePath] the path of the file we're trying to parse
*/
// eslint-disable-next-line consistent-return
static getAst(code, astType, { filePath } = {}) {
try {
// eslint-disable-next-line default-case
switch (astType) {
case 'babel':
return this._getBabelAst(code);
case 'typescript':
LogService.warn(`
Please notice "typescript" support is deprecated.
For parsing javascript, "babel" is recommended.`);
return this._getTypescriptAst(code);
case 'es-module-lexer':
LogService.warn(`
Please notice "es-module-lexer" support is deprecated.
For parsing javascript, "babel" is recommended.`);
return this._getEsModuleLexerOutput(code);
}
} catch (e) {
      LogService.error(`Error when parsing "${filePath}":\n${e}`);
}
}
}
module.exports = { AstService };
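// A minimal usage sketch (the code string below is just an illustration):
//
//   const { AstService } = require('./AstService.js');
//   const ast = AstService.getAst("import { x } from './x.js';", 'babel');
//   // `ast` is a Babel File node that the analyzers can traverse.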

View file

@ -0,0 +1,297 @@
/* eslint-disable no-param-reassign */
// @ts-ignore-next-line
require('../types/index.js');
const fs = require('fs');
const pathLib = require('path');
const child_process = require('child_process'); // eslint-disable-line camelcase
const glob = require('glob');
const { LogService } = require('./LogService.js');
const { AstService } = require('./AstService.js');
const { getFilePathRelativeFromRoot } = require('../utils/get-file-path-relative-from-root.js');
/**
*
* @param {string|array} v
* @returns {array}
*/
function ensureArray(v) {
return Array.isArray(v) ? v : [v];
}
function multiGlobSync(patterns, { keepDirs = false } = {}) {
patterns = ensureArray(patterns);
const res = new Set();
patterns.forEach(pattern => {
const files = glob.sync(pattern);
files.forEach(filePath => {
if (fs.lstatSync(filePath).isDirectory() && !keepDirs) {
return;
}
res.add(filePath);
});
});
return Array.from(res);
}
const defaultGatherFilesConfig = {
extensions: ['.js'],
excludeFiles: [],
excludeFolders: ['node_modules', 'bower_components'],
includePaths: [],
depth: Infinity,
};
/**
* @typedef {Object} ProjectData
 * @property {{ name: string, path: string }} project project name and full path to the project folder
* @property {string[]} entries all file paths within project folder
*/
/**
* To be used in main program.
* It creates an instance on which the 'files' array is stored.
* The files array contains all projects.
*
* Also serves as SSOT in many other contexts wrt data locations and gathering
*/
class InputDataService {
/**
* @desc create an array of ProjectData
* @param {string[]} projectPaths
* @param {GatherFilesConfig} gatherFilesConfig
   * @returns {ProjectData[]}
*/
static createDataObject(projectPaths, gatherFilesConfig = {}) {
const inputData = projectPaths.map(projectPath => ({
project: {
name: pathLib.basename(projectPath),
path: projectPath,
},
entries: this.gatherFilesFromDir(projectPath, {
...defaultGatherFilesConfig,
...gatherFilesConfig,
}),
}));
return this._addMetaToProjectsData(inputData);
}
/**
* From 'main/file.js' or '/main/file.js' to './main/file.js'
*/
static __normalizeMainEntry(mainEntry) {
if (mainEntry.startsWith('/')) {
return `.${mainEntry}`;
}
if (!mainEntry.startsWith('.')) {
return `./${mainEntry}`;
}
return mainEntry;
}
/**
* @param {string} projectPath
*/
static getProjectMeta(projectPath) {
const project = { path: projectPath };
// Add project meta info
try {
const file = pathLib.resolve(projectPath, 'package.json');
const pkgJson = JSON.parse(fs.readFileSync(file, 'utf8'));
// eslint-disable-next-line no-param-reassign
project.mainEntry = this.__normalizeMainEntry(pkgJson.main || './index.js');
// eslint-disable-next-line no-param-reassign
project.name = pkgJson.name;
// TODO: also add meta info whether we are in a monorepo or not.
// We do this by checking whether there is a lerna.json on root level.
// eslint-disable-next-line no-empty
project.version = pkgJson.version;
} catch (e) {
LogService.warn(e);
}
project.commitHash = this._getCommitHash(projectPath);
return project;
}
static _getCommitHash(projectPath) {
let commitHash;
let isGitRepo;
try {
isGitRepo = fs.lstatSync(pathLib.resolve(projectPath, '.git')).isDirectory();
// eslint-disable-next-line no-empty
} catch (_) {}
if (isGitRepo) {
try {
const hash = child_process
.execSync('git rev-parse HEAD', {
cwd: projectPath,
})
.toString('utf-8')
.slice(0, -1);
// eslint-disable-next-line no-param-reassign
commitHash = hash;
} catch (e) {
LogService.warn(e);
}
} else {
commitHash = '[not-a-git-root]';
}
return commitHash;
}
/**
   * @desc adds context with code (i.e. file contents), project name and project 'main' entry
* @param {InputData} inputData
*/
static _addMetaToProjectsData(inputData) {
return inputData.map(projectObj => {
// Add context obj with 'code' to files
const newEntries = [];
projectObj.entries.forEach(entry => {
const code = fs.readFileSync(entry, 'utf8');
const file = getFilePathRelativeFromRoot(entry, projectObj.project.path);
if (pathLib.extname(file) === '.html') {
const extractedScripts = AstService.getScriptsFromHtml(code);
// eslint-disable-next-line no-shadow
extractedScripts.forEach((code, i) => {
newEntries.push({ file: `${file}#${i}`, context: { code } });
});
} else {
newEntries.push({ file, context: { code } });
}
});
const project = this.getProjectMeta(projectObj.project.path);
return { project, entries: newEntries };
});
}
// TODO: rename to `get targetProjectPaths`
/**
* @desc gets all project directories/paths from './submodules'
* @returns {string[]} a list of strings representing all entry paths for projects we want to query
*/
static getTargetProjectPaths() {
if (this.__targetProjectPaths) {
return this.__targetProjectPaths;
}
const submoduleDir = pathLib.resolve(
__dirname,
'../../../providence-input-data/search-targets',
);
let dirs;
try {
dirs = fs.readdirSync(submoduleDir);
} catch (_) {
return [];
}
return dirs
.map(dir => pathLib.join(submoduleDir, dir))
.filter(dirPath => fs.lstatSync(dirPath).isDirectory());
}
static get referenceProjectPaths() {
if (this.__referenceProjectPaths) {
return this.__referenceProjectPaths;
}
let dirs;
try {
const referencesDir = pathLib.resolve(__dirname, '../../../providence-input-data/references');
dirs = fs.readdirSync(referencesDir);
dirs = dirs
.map(dir => pathLib.join(referencesDir, dir))
.filter(dirPath => fs.lstatSync(dirPath).isDirectory());
// eslint-disable-next-line no-empty
} catch (_) {}
return dirs;
}
static set referenceProjectPaths(v) {
this.__referenceProjectPaths = ensureArray(v);
}
static set targetProjectPaths(v) {
this.__targetProjectPaths = ensureArray(v);
}
static getDefaultGatherFilesConfig() {
return defaultGatherFilesConfig;
}
static getGlobPattern(startPath, cfg, withoutDepth = false) {
// if startPath ends with '/', remove
let globPattern = startPath.replace(/\/$/, '');
if (!withoutDepth) {
if (cfg.depth !== Infinity) {
globPattern += `/*`.repeat(cfg.depth + 1);
} else {
globPattern += `/**/*`;
}
}
return globPattern;
}
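  // For example (hypothetical start path): with startPath '/my/proj' and depth Infinity the
  // pattern becomes '/my/proj/**/*'; with depth 1 it becomes '/my/proj/*/*' (depth + 1
  // path segments); with withoutDepth it stays '/my/proj'.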
/**
* @desc Gets an array of files for given extension
* @param {string} startPath - local filesystem path
* @param {GatherFilesConfig} customConfig - configuration object
   * @param {number} [customConfig.depth=Infinity] how many directory levels deep files are gathered
   * @returns {string[]} list of file paths
*/
static gatherFilesFromDir(startPath, customConfig) {
const cfg = {
...defaultGatherFilesConfig,
...customConfig,
};
let globPattern = this.getGlobPattern(startPath, cfg);
globPattern += `.{${cfg.extensions.map(e => e.slice(1)).join(',')},}`;
const globRes = multiGlobSync(globPattern);
const globPatternWithoutDepth = this.getGlobPattern(startPath, cfg, true);
let excludedGlobFiles;
if (cfg.exclude) {
excludedGlobFiles = multiGlobSync(`${globPatternWithoutDepth}/${cfg.exclude}`);
}
let filteredGlobRes = globRes.filter(gr => {
const localGr = gr.replace(startPath, '');
return (
!cfg.excludeFolders.some(f => localGr.includes(`${f}/`)) &&
!cfg.excludeFiles.some(f => localGr.includes(f)) &&
!(excludedGlobFiles && excludedGlobFiles.some(f => gr.includes(f)))
);
});
if (cfg.includePaths && cfg.includePaths.length) {
filteredGlobRes = globRes.filter(gr =>
cfg.includePaths.some(p => gr.startsWith(pathLib.resolve(startPath, p))),
);
}
if (!filteredGlobRes || !filteredGlobRes.length) {
LogService.warn(`No files found for path '${startPath}'`);
}
return filteredGlobRes;
}
/**
* @desc Allows the user to provide a providence.conf.js file in its repository root
*/
static getExternalConfig() {
try {
// eslint-disable-next-line import/no-dynamic-require, global-require
return require(`${process.cwd()}/providence.conf.js`);
} catch (_) {
return null;
}
}
}
module.exports = { InputDataService };
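// A minimal usage sketch (the project path and config below are hypothetical placeholders):
//
//   const { InputDataService } = require('./InputDataService.js');
//   const filePaths = InputDataService.gatherFilesFromDir('/path/to/project', {
//     extensions: ['.js'],
//     excludeFolders: ['node_modules', 'bower_components', 'demo'],
//   });
//   // => an array of file paths matched by the glob pattern built above.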

View file

@ -0,0 +1,82 @@
const pathLib = require('path');
const chalk = require('chalk');
const ora = require('ora');
const fs = require('fs');
const { log } = console;
function printTitle(title) {
return `${title ? `${title}\n` : ''}`;
}
let spinner;
class LogService {
static debug(text, title) {
if (!this.debugEnabled) return;
log(chalk.bgCyanBright.black.bold(` debug${printTitle(title)}`), text);
this._logHistory.push(`- debug -${printTitle(title)} ${text}`);
}
static warn(text, title) {
log(chalk.bgYellowBright.black.bold(`warning${printTitle(title)}`), text);
this._logHistory.push(`- warning -${printTitle(title)} ${text}`);
}
static error(text, title) {
log(chalk.bgRedBright.black.bold(` error${printTitle(title)}`), text);
this._logHistory.push(`- error -${printTitle(title)} ${text}`);
}
static success(text, title) {
log(chalk.bgGreen.black.bold(`success${printTitle(title)}`), text);
this._logHistory.push(`- success -${printTitle(title)} ${text}`);
}
static info(text, title) {
log(chalk.bgBlue.black.bold(` info${printTitle(title)}`), text);
this._logHistory.push(`- info -${printTitle(title)} ${text}`);
}
static spinnerStart(text) {
spinner = ora(text).start();
}
static spinnerText(text) {
if (!spinner) {
this.spinnerStart(text);
}
spinner.text = text;
}
static spinnerStop() {
spinner.stop();
}
static get spinner() {
return spinner;
}
static pad(str, minChars = 20) {
let result = str;
const padding = minChars - str.length;
if (padding > 0) {
result += ' '.repeat(padding);
}
return result;
}
static writeLogFile() {
const filePath = pathLib.join(process.cwd(), 'providence.log');
let file = `[log ${new Date()}]\n`;
this._logHistory.forEach(l => {
file += `${l}\n`;
});
file += `[/log ${new Date()}]\n\n`;
fs.writeFileSync(filePath, file, { flag: 'a' });
this._logHistory = [];
}
}
LogService.debugEnabled = false;
LogService._logHistory = [];
module.exports = { LogService };

View file

@ -0,0 +1,319 @@
// @ts-ignore-next-line
require('../types/index.js');
const deepmerge = require('deepmerge');
const child_process = require('child_process'); // eslint-disable-line camelcase
const { AstService } = require('./AstService.js');
const { LogService } = require('./LogService.js');
const { getFilePathRelativeFromRoot } = require('../utils/get-file-path-relative-from-root.js');
const astProjectsDataCache = new Map();
class QueryService {
/**
* @param {string} regexString string for 'free' regex searches.
* @returns {QueryConfig}
*/
static getQueryConfigFromRegexSearchString(regexString) {
return { type: 'search', regexString };
}
/**
   * @desc Util function that can be used to parse cli input into a QueryConfig, which can then
   * be fed to a query method (like grepSearch) to create a QueryResult
   * @example
   * const queryConfig = QueryService.getQueryConfigFromFeatureString('tg-icon[size=xs]')
* const myQueryResult = QueryService.grepSearch(inputData, queryConfig)
* @param {string} queryString - string like tg-icon[size=xs]
* @returns {QueryConfig}
*/
static getQueryConfigFromFeatureString(queryString) {
function parseContains(candidate) {
const hasAsterisk = candidate ? candidate.endsWith('*') : null;
const filtered = hasAsterisk ? candidate.slice(0, -1) : candidate;
return [filtered, hasAsterisk];
}
// Detect the features in the query
let tagCandidate;
let featString;
    // Creates tag ('tg-icon') and featString ('size=xs')
const match = queryString.match(/(^.*)(\[(.+)\])+/);
if (match) {
// eslint-disable-next-line prefer-destructuring
tagCandidate = match[1];
// eslint-disable-next-line prefer-destructuring
featString = match[3];
} else {
tagCandidate = queryString;
}
const [tag, usesTagPartialMatch] = parseContains(tagCandidate);
let featureObj;
if (featString) {
const [nameCandidate, valueCandidate] = featString.split('=');
const [name, usesValueContains] = parseContains(nameCandidate);
const [value, usesValuePartialMatch] = parseContains(valueCandidate);
featureObj = /** @type {Feature} */ {
name,
value,
tag,
isAttribute: true,
usesValueContains,
usesValuePartialMatch,
usesTagPartialMatch,
};
} else {
// Just look for tag name
featureObj = { tag, usesTagPartialMatch };
}
return { type: 'feature', feature: featureObj };
}
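  // For example (hypothetical query string):
  //   QueryService.getQueryConfigFromFeatureString('tg-icon[size=xs]')
  //   // => { type: 'feature',
  //   //      feature: { name: 'size', value: 'xs', tag: 'tg-icon', isAttribute: true,
  //   //                 usesValueContains: false, usesValuePartialMatch: false,
  //   //                 usesTagPartialMatch: false } }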
/**
   * @desc retrieves an analyzer: either the provided Analyzer class itself, or the default
   * export found under '../analyzers/<analyzer-name>' when a name like 'find-imports' is given
   * @param {string|Analyzer} analyzerObjectOrString
   * @param {object} [analyzerConfig]
   * @returns {QueryConfig}
*/
static getQueryConfigFromAnalyzer(analyzerObjectOrString, analyzerConfig) {
let analyzer;
if (typeof analyzerObjectOrString === 'string') {
// Get it from our location(s) of predefined analyzers.
// Mainly needed when this method is called via cli
try {
// eslint-disable-next-line import/no-dynamic-require, global-require
analyzer = require(`../analyzers/${analyzerObjectOrString}`);
} catch (e) {
LogService.error(e);
process.exit(1);
}
} else {
// We don't need to import the analyzer, since we already have it
analyzer = analyzerObjectOrString;
}
return {
type: 'analyzer',
analyzerName: analyzer.name,
analyzerConfig,
analyzer,
};
}
/**
* @desc Search via unix grep
* @param {InputData} inputData
* @param {QueryConfig} queryConfig
* @param {object} [customConfig]
* @param {boolean} [customConfig.hasVerboseReporting]
* @param {object} [customConfig.gatherFilesConfig]
* @returns {Promise<QueryResult>}
*/
static async grepSearch(inputData, queryConfig, customConfig) {
const cfg = deepmerge(
{
hasVerboseReporting: false,
gatherFilesConfig: {},
},
customConfig,
);
const results = [];
// 1. Analyze the type of query from the QueryConfig (for instance 'feature' or 'search').
let regex;
if (queryConfig.type === 'feature') {
regex = this._getFeatureRegex(queryConfig.feature);
} else if (queryConfig.type === 'search') {
regex = queryConfig.regexString;
}
await Promise.all(
inputData.map(async projectData => {
// 2. For all files found in project, we will do a different grep
const projectResult = {};
const countStdOut = await this._performGrep(projectData.project.path, regex, {
count: true,
gatherFilesConfig: cfg.gatherFilesConfig,
});
projectResult.count = Number(countStdOut);
if (cfg.hasVerboseReporting) {
const detailStdout = await this._performGrep(projectData.project.path, regex, {
count: false,
gatherFilesConfig: cfg.gatherFilesConfig,
});
projectResult.files = detailStdout
.split('\n')
.filter(l => l)
.map(l => {
const [absolutePath, line] = l.split(':');
const file = getFilePathRelativeFromRoot(absolutePath, projectData.path);
const link = l.split(':').slice(0, 2).join(':');
const match = l.split(':').slice(2);
return { file, line: Number(line), match, link };
});
}
results.push({ project: projectData.project, ...projectResult });
}),
);
return /** @type {QueryResult} */ {
meta: {
searchType: 'grep',
query: queryConfig,
},
queryOutput: results,
};
}
/**
* @desc Search via ast (typescript compilation)
* @param {QueryConfig} queryConfig
* @param {AnalyzerConfig} [customConfig]
* @param {GatherFilesConfig} [customConfig.gatherFilesConfig]
 * @returns {Promise<QueryResult>}
*/
static async astSearch(queryConfig, customConfig) {
if (queryConfig.type !== 'analyzer') {
LogService.error('Only analyzers supported for ast searches at the moment');
process.exit(1);
}
// eslint-disable-next-line new-cap
const analyzer = new queryConfig.analyzer();
const analyzerResult = await analyzer.execute(customConfig);
if (!analyzerResult) {
return analyzerResult;
}
const { queryOutput, analyzerMeta } = analyzerResult;
const /** @type {QueryResult} */ queryResult = {
meta: {
searchType: 'ast-analyzer',
analyzerMeta,
},
queryOutput,
};
return queryResult;
}
/**
* @param {ProjectData[]} projectsData
* @param {'babel'|'typescript'|'es-module-lexer'} requiredAst
*/
static async addAstToProjectsData(projectsData, requiredAst) {
return projectsData.map(projectData => {
const cachedData = astProjectsDataCache.get(projectData.project.path);
if (cachedData) {
return cachedData;
}
const resultEntries = projectData.entries.map(entry => {
const ast = AstService.getAst(entry.context.code, requiredAst, { filePath: entry.file });
return { ...entry, ast };
});
const astData = { ...projectData, entries: resultEntries };
this._addToProjectsDataCache(projectData.project.path, astData);
return astData;
});
}
/**
* We need to make sure we don't run into memory issues (ASTs are huge),
 * so we only keep a couple of projects in cache. This is a performance benefit for
 * lion-based-ui-cli, which runs providence consecutively for the same project.
 * TODO: instead of storing a fixed number of results in cache, use sizeof and a memory limit
* to allow for more projects
* @param {string} path
* @param {InputData} astData
*/
static _addToProjectsDataCache(path, astData) {
if (this.cacheDisabled) {
return;
}
// In order to prevent running out of memory, there is a limit to the number of
// project ASTs in cache. For a session running multiple analyzers for reference
// and target projects, we need this number to be at least 2.
if (astProjectsDataCache.size >= 2) {
astProjectsDataCache.delete(astProjectsDataCache.keys().next().value);
}
astProjectsDataCache.set(path, astData);
}
/**
 * @desc Builds the grep regex for a certain tag name and feature
 * @param {Feature} feature
 * @returns {string} regex to be fed to grep
 */
static _getFeatureRegex(feature) {
const { name, value, tag } = feature;
let potentialTag;
if (tag) {
potentialTag = feature.usesTagPartialMatch ? `.*${tag}.+` : tag;
} else {
potentialTag = '.*';
}
let regex;
if (name) {
if (value) {
// We are looking for an exact match: div[class=foo] -> <div class="foo">
let valueRe = value;
if (feature.usesValueContains) {
if (feature.usesValuePartialMatch) {
// We are looking for a partial match: div[class*=foo*] -> <div class="baz foo-bar">
valueRe = `.+${value}.+`;
} else {
// We are looking for an exact match inside a space separated list within an
// attr: div[class*=foo] -> <div class="baz foo bar">
valueRe = `((${value})|("${value} .*)|(.* ${value}")|(.* ${value} .*))`;
}
}
regex = `<${potentialTag} .*${name}="${valueRe}".+>`;
} else {
regex = `<${potentialTag} .*${name}(>|( |=).+>)`;
}
} else if (tag) {
regex = `<${potentialTag} .+>`;
} else {
LogService.error('Please provide a proper Feature');
}
return regex;
}
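// Illustrative outcome (assumption, not a fixture from this repo):
// _getFeatureRegex({ tag: 'div', name: 'class', value: 'foo' }) -> '<div .*class="foo".+>'
/**
 * @desc Performs a grep on given path for a certain regex
 * @param {string} searchPath - the project path to search in
 * @param {string} regex - the regex to grep for
 * @param {object} [customConfig]
 * @param {boolean} [customConfig.count] - enable wordcount in grep
 * @param {GatherFilesConfig} [customConfig.gatherFilesConfig] - extensions, excludes
 * @param {boolean} [customConfig.hasDebugEnabled]
 */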
static _performGrep(searchPath, regex, customConfig) {
const cfg = deepmerge(
{
count: false,
gatherFilesConfig: {},
hasDebugEnabled: false,
},
customConfig,
);
const /** @type {string[]} */ ext = cfg.gatherFilesConfig.extensions;
const include = ext ? `--include="\\.(${ext.map(e => e.slice(1)).join('|')})" ` : '';
const count = cfg.count ? ' | wc -l' : '';
// TODO: test on Linux (only tested on Mac)
const cmd = `pcregrep -ornM ${include} '${regex}' ${searchPath} ${count}`;
if (cfg.hasDebugEnabled) {
LogService.debug(cmd, 'grep command');
}
return new Promise(resolve => {
child_process.exec(cmd, { maxBuffer: 200000000 }, (err, stdout) => {
resolve(stdout);
});
});
}
}
QueryService.cacheDisabled = false;
module.exports = { QueryService };

View file

@ -0,0 +1,102 @@
// @ts-ignore-next-line
require('../types/index.js');
const fs = require('fs');
const pathLib = require('path');
const getHash = require('../utils/get-hash.js');
/**
 * @desc Creates a unique identifier based on the search target project, the analyzer
 * configuration and (optionally) the reference project. Used to write results to and
 * read results from the file system.
* @param {object} searchP search target project meta
* @param {object} cfg configuration used for analyzer
* @param {object} [refP] reference project meta
* @returns {string} identifier
*/
function createResultIdentifier(searchP, cfg, refP) {
// why encodeURIComponent: filters out slashes for path names for stuff like @lion/button
const format = p =>
`${encodeURIComponent(p.name)}_${p.version || (p.commitHash && p.commitHash.slice(0, 5))}`;
const cfgHash = getHash(cfg);
return `${format(searchP)}${refP ? `_+_${format(refP)}` : ''}__${cfgHash}`;
}
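// Illustrative outcome (values are hypothetical; the hash depends on cfg):
// createResultIdentifier({ name: '@lion/button', version: '0.8.0' }, { gatherFilesConfig: {} })
// // -> '%40lion%2Fbutton_0.8.0__<cfgHash>'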
class ReportService {
/**
* @desc
* Prints queryResult report to console
* @param {QueryResult} queryResult
*/
static printToConsole(queryResult) {
/* eslint-disable no-console */
console.log('== QUERY: =========');
console.log(JSON.stringify(queryResult.meta, null, 2));
console.log('\n== RESULT: =========');
console.log(JSON.stringify(queryResult.queryOutput, null, 2));
console.log('\n----------------------------------------\n');
/* eslint-enable no-console */
}
/**
* @desc
* Prints queryResult report as JSON to outputPath
* @param {QueryResult} queryResult
* @param {string} [identifier]
* @param {string} [outputPath]
*/
static writeToJson(
queryResult,
identifier = new Date().getTime() / 1000,
outputPath = this.outputPath,
) {
const output = JSON.stringify(queryResult, null, 2);
if (!fs.existsSync(outputPath)) {
fs.mkdirSync(outputPath);
}
const { name } = queryResult.meta.analyzerMeta;
const filePath = this._getResultFileNameAndPath(name, identifier);
fs.writeFileSync(filePath, output, { flag: 'w' });
}
static set outputPath(p) {
this.__outputPath = p;
}
static get outputPath() {
return this.__outputPath || pathLib.join(process.cwd(), '/providence-output');
}
static createIdentifier({ targetProject, referenceProject, analyzerConfig }) {
return createResultIdentifier(targetProject, analyzerConfig, referenceProject);
}
static getCachedResult({ analyzerName, identifier }) {
let cachedResult;
try {
cachedResult = JSON.parse(
fs.readFileSync(this._getResultFileNameAndPath(analyzerName, identifier), 'utf-8'),
);
// eslint-disable-next-line no-empty
} catch (_) {}
return cachedResult;
}
static _getResultFileNameAndPath(name, identifier) {
return pathLib.join(this.outputPath, `${name || 'query'}_-_${identifier}.json`);
}
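// Illustrative outcome (identifier value is hypothetical):
// _getResultFileNameAndPath('match-imports', 'my-target_1.0.0__123')
// // -> '<outputPath>/match-imports_-_my-target_1.0.0__123.json'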
static writeEntryToSearchTargetDepsFile(depProj, rootProjectMeta) {
const rootProj = `${rootProjectMeta.name}#${rootProjectMeta.version}`;
const filePath = pathLib.join(this.outputPath, 'search-target-deps-file.json');
let file = {};
try {
file = JSON.parse(fs.readFileSync(filePath, 'utf-8'));
// eslint-disable-next-line no-empty
} catch (_) {}
const deps = [...(file[rootProj] || []), depProj];
file[rootProj] = [...new Set(deps)];
fs.writeFileSync(filePath, JSON.stringify(file, null, 2), { flag: 'w' });
}
}
module.exports = { ReportService };

View file

@ -0,0 +1,57 @@
/**
* @typedef {Object} Feature
* @property {string} [name] the name of the feature. For instance 'size'
* @property {string} [value] the value of the feature. For instance 'xl'
* @property {string} [memberOf] the name of the object this feature belongs to.
*
* @property {string} [tag] the HTML element it belongs to. Will be used in html
* queries. This option will take precedence over 'memberOf' when configured
* @property {boolean} [isAttribute] useful for HTML queries explicitly looking for attribute
 * name instead of property name. When false (default), query searches for properties
* @property {boolean} [usesValueContains] when the attribute value is not an exact match
* @property {boolean} [usesValuePartialMatch] when looking for a partial match:
* div[class*=foo*] -> <div class="baz foo-bar">
 * @property {boolean} [usesTagPartialMatch] when the tag name should be a partial match:
 * tg-icon* -> <tg-icon-extended>
*/
/**
* @typedef {Object} QueryResult result of a query. For all projects and files, gives the
* result of the query.
* @property {Object} QueryResult.meta
 * @property {'ast-analyzer'|'grep'} QueryResult.meta.searchType
 * @property {QueryConfig} QueryResult.meta.query
 * @property {Object[]} QueryResult.queryOutput
* @property {string} QueryResult.queryOutput[].project project name as determined by InputDataService (based on folder name)
* @property {number} QueryResult.queryOutput[].count
* @property {Object[]} [QueryResult.queryOutput[].files]
* @property {string} QueryResult.queryOutput[].files[].file
* @property {number} QueryResult.queryOutput[].files[].line
* @property {string} QueryResult.queryOutput[].files[].match
*/
/**
 * @typedef {object} QueryConfig an object describing what to search for
 * @property {string} QueryConfig.type the type of query: 'feature', 'search' or 'analyzer'.
 * A certain type has an additional property with more detailed information about the type
* @property {Feature} feature query details for a feature search
*/
/**
* @typedef {Object} InputDataProject - all files found that are queryable
* @property {string} InputDataProject.project - the project name
* @property {string} InputDataProject.path - the path to the project
* @property {string[]} InputDataProject.entries - array of paths that are found within 'project' that
 * comply with the rules as configured in 'gatherFilesConfig'
*/
/**
* @typedef {InputDataProject[]} InputData - all files found that are queryable
*/
/**
* @typedef {Object} GatherFilesConfig
* @property {string[]} [extensions] file extension like ['.js', '.html']
* @property {string[]} [excludeFiles] file names filtered out
 * @property {string[]} [excludeFolders] folder names filtered out
*/

View file

@ -0,0 +1,41 @@
/**
* @desc Readable way to do an async forEach
 * Since predictability matters, all array items will be handled in a queue;
 * one after another
* @param {array} array
* @param {function} callback
*/
async function aForEach(array, callback) {
for (let i = 0; i < array.length; i += 1) {
// eslint-disable-next-line no-await-in-loop
await callback(array[i], i);
}
}
/**
 * @desc Readable way to do an async forEach when order of handling does not matter:
 * all array items are handled in parallel (via Promise.all)
* @param {array} array
* @param {function} callback
*/
async function aForEachNonSequential(array, callback) {
return Promise.all(array.map(callback));
}
/**
* @desc Readable way to do an async map
* Since predictability is crucial for a map, all array items will be handled in a queue;
 * one after another
* @param {array} array
* @param {function} callback
*/
async function aMap(array, callback) {
const mappedResults = [];
for (let i = 0; i < array.length; i += 1) {
// eslint-disable-next-line no-await-in-loop
const resolvedCb = await callback(array[i], i);
mappedResults.push(resolvedCb);
}
return mappedResults;
}
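// Illustrative usage (hypothetical data, not part of the original file):
// await aForEach([1, 2, 3], async n => { await handle(n); }); // handled one after another
// const doubled = await aMap([1, 2, 3], async n => n * 2); // -> [2, 4, 6]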
module.exports = { aForEach, aMap, aForEachNonSequential };

View file

@ -0,0 +1,12 @@
/**
 * @desc relative path of analyzed file, relative to the project root of the analyzed project
* - from: '/my/machine/details/analyzed-project/relevant/file.js'
* - to: './relevant/file.js'
* @param {string} absolutePath
* @param {string} projectRoot
*/
function getFilePathRelativeFromRoot(absolutePath, projectRoot) {
return absolutePath.replace(projectRoot, '.');
}
module.exports = { getFilePathRelativeFromRoot };

View file

@ -0,0 +1,19 @@
/**
*
* @param {string|object} inputValue
* @returns {number}
*/
function getHash(inputValue) {
if (typeof inputValue === 'object') {
// eslint-disable-next-line no-param-reassign
inputValue = JSON.stringify(inputValue);
}
return inputValue.split('').reduce(
(prevHash, currVal) =>
// eslint-disable-next-line no-bitwise
((prevHash << 5) - prevHash + currVal.charCodeAt(0)) | 0,
0,
);
}
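// Illustrative usage (input is hypothetical):
// getHash({ gatherFilesConfig: {} }); // -> a (possibly negative) 32-bit integer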
module.exports = getHash;

View file

@ -0,0 +1,125 @@
/* eslint-disable */
/**
* The MIT License (MIT)
*
* Copyright (c) 2015 Ryo Maruyama
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
// From: https://github.com/esdoc/esdoc/blob/master/src/Parser/CommentParser.js
/**
* Doc Comment Parser class.
*
* @example
* for (let comment of node.leadingComments) {
* let tags = CommentParser.parse(comment);
* console.log(tags);
* }
*/
class JsdocCommentParser {
/**
* parse comment to tags.
* @param {ASTNode} commentNode - comment node.
* @param {string} commentNode.value - comment body.
* @param {string} commentNode.type - CommentBlock or CommentLine.
* @returns {Tag[]} parsed comment.
*/
static parse(commentNode) {
if (!this.isESDoc(commentNode)) return [];
let comment = commentNode.value;
// TODO: refactor
comment = comment.replace(/\r\n/gm, '\n'); // for windows
comment = comment.replace(/^[\t ]*/gm, ''); // remove line head space
comment = comment.replace(/^\*[\t ]?/, ''); // remove first '*'
comment = comment.replace(/[\t ]$/, ''); // remove last space
comment = comment.replace(/^\*[\t ]?/gm, ''); // remove line head '*'
if (comment.charAt(0) !== '@') comment = `@desc ${comment}`; // auto insert @desc
comment = comment.replace(/[\t ]*$/, ''); // remove tail space.
comment = comment.replace(/```[\s\S]*?```/g, match => match.replace(/@/g, '\\ESCAPED_AT\\')); // escape code in descriptions
comment = comment.replace(/^[\t ]*(@\w+)$/gm, '$1 \\TRUE'); // auto insert tag text to non-text tag (e.g. @interface)
comment = comment.replace(/^[\t ]*(@\w+)[\t ](.*)/gm, '\\Z$1\\Z$2'); // insert separator (\\Z@tag\\Ztext)
const lines = comment.split('\\Z');
let tagName = '';
let tagValue = '';
const tags = [];
for (let i = 0; i < lines.length; i++) {
const line = lines[i];
if (line.charAt(0) === '@') {
tagName = line;
const nextLine = lines[i + 1];
if (nextLine.charAt(0) === '@') {
tagValue = '';
} else {
tagValue = nextLine;
i++;
}
tagValue = tagValue
.replace('\\TRUE', '')
.replace(/\\ESCAPED_AT\\/g, '@')
.replace(/^\n/, '')
.replace(/\n*$/, '');
tags.push({ tagName, tagValue });
}
}
return tags;
}
/**
* parse node to tags.
* @param {ASTNode} node - node.
* @returns {{tags: Tag[], commentNode: CommentNode}} parsed comment.
*/
static parseFromNode(node) {
if (!node.leadingComments) node.leadingComments = [{ type: 'CommentBlock', value: '' }];
const commentNode = node.leadingComments[node.leadingComments.length - 1];
const tags = this.parse(commentNode);
return { tags, commentNode };
}
/**
* judge doc comment or not.
* @param {ASTNode} commentNode - comment node.
* @returns {boolean} if true, this comment node is doc comment.
*/
static isESDoc(commentNode) {
if (commentNode.type !== 'CommentBlock') return false;
return commentNode.value.charAt(0) === '*';
}
/**
* build comment from tags
* @param {Tag[]} tags
* @returns {string} block comment value.
*/
static buildComment(tags) {
return tags.reduce((comment, tag) => {
const line = tag.tagValue.replace(/\n/g, '\n * ');
return `${comment} * ${tag.tagName} \n * ${line} \n`;
}, '*\n');
}
}
module.exports = JsdocCommentParser;

View file

@ -0,0 +1,23 @@
// import htm from 'htm';
const htm = require('htm');
function convertToObj(type, props, ...children) {
return { type, props, children };
}
/**
* @desc
* Used for parsing lit-html templates inside ASTs
 * @returns {{type: string, props: object, children: Array}}
*
* @example
* litToObj`<h1 .id=${'hello'}>Hello world!</h1>`;
* // {
* // type: 'h1',
* // props: { .id: 'hello' },
* // children: ['Hello world!']
* // }
*/
const litToObj = htm.bind(convertToObj);
module.exports = litToObj;

View file

@ -0,0 +1,34 @@
function memoize(func, externalStorage) {
const storage = externalStorage || {};
// eslint-disable-next-line func-names
return function () {
// eslint-disable-next-line prefer-rest-params
const args = [...arguments];
if (args in storage) {
return storage[args];
}
const outcome = func.apply(this, args);
storage[args] = outcome;
return outcome;
};
}
function memoizeAsync(func, externalStorage) {
const storage = externalStorage || {};
// eslint-disable-next-line func-names
return async function () {
// eslint-disable-next-line prefer-rest-params
const args = [...arguments];
if (args in storage) {
return storage[args];
}
const outcome = await func.apply(this, args);
storage[args] = outcome;
return outcome;
};
}
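// Illustrative usage (hypothetical function, not part of the original file):
// const memoizedSquare = memoize(n => n * n);
// memoizedSquare(4); // computed and stored under the stringified argument list
// memoizedSquare(4); // served from storage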
module.exports = {
memoize,
memoizeAsync,
};

View file

@ -0,0 +1,222 @@
/* eslint-disable */
/**
* This is a modified version of https://github.com/npm/read-package-tree/blob/master/rpt.js
* The original is meant for npm dependencies only. In our (rare) case, we have a hybrid landscape
* where we also want to look for npm dependencies inside bower dependencies (bower_components folder).
*
* Original: https://github.com/npm/read-package-tree
*
* The ISC License
*
* Copyright (c) Isaac Z. Schlueter and Contributors
*
* Permission to use, copy, modify, and/or distribute this software for any
* purpose with or without fee is hereby granted, provided that the above
* copyright notice and this permission notice appear in all copies.
*
* THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
* WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
* MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
* ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
* WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
* ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR
* IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
*/
const fs = require('fs');
/* istanbul ignore next */
const promisify = require('util').promisify || require('util-promisify');
const { resolve, basename, dirname, join } = require('path');
const rpj = promisify(require('read-package-json'));
const readdir = promisify(require('readdir-scoped-modules'));
const realpath = require('read-package-tree/realpath.js');
let ID = 0;
class Node {
constructor(pkg, logical, physical, er, cache) {
// should be impossible.
const cached = cache.get(physical);
/* istanbul ignore next */
if (cached && !cached.then) throw new Error('re-creating already instantiated node');
cache.set(physical, this);
const parent = basename(dirname(logical));
if (parent.charAt(0) === '@') this.name = `${parent}/${basename(logical)}`;
else this.name = basename(logical);
this.path = logical;
this.realpath = physical;
this.error = er;
this.id = ID++;
this.package = pkg || {};
this.parent = null;
this.isLink = false;
this.children = [];
}
}
class Link extends Node {
constructor(pkg, logical, physical, realpath, er, cache) {
super(pkg, logical, physical, er, cache);
// if the target has started, but not completed, then
// a Promise will be in the cache to indicate this.
const cachedTarget = cache.get(realpath);
if (cachedTarget && cachedTarget.then)
cachedTarget.then(node => {
this.target = node;
this.children = node.children;
});
this.target = cachedTarget || new Node(pkg, logical, realpath, er, cache);
this.realpath = realpath;
this.isLink = true;
this.error = er;
this.children = this.target.children;
}
}
// this is the way it is to expose a timing issue which is difficult to
// test otherwise. The creation of a Node may take slightly longer than
// the creation of a Link that targets it. If the Node has _begun_ its
// creation phase (and put a Promise in the cache) then the Link will
// get a Promise as its cachedTarget instead of an actual Node object.
// This is not a problem, because it gets resolved prior to returning
// the tree or attempting to load children. However, it IS remarkably
// difficult to get to happen in a test environment to verify reliably.
// Hence this kludge.
const newNode = (pkg, logical, physical, er, cache) =>
process.env._TEST_RPT_SLOW_LINK_TARGET_ === '1'
? new Promise(res => setTimeout(() => res(new Node(pkg, logical, physical, er, cache)), 10))
: new Node(pkg, logical, physical, er, cache);
const loadNode = (logical, physical, cache, rpcache, stcache) => {
// cache temporarily holds a promise placeholder so we
// don't try to create the same node multiple times.
// this is very rare to encounter, given the aggressive
// caching on fs.realpath and fs.lstat calls, but
// it can happen in theory.
const cached = cache.get(physical);
/* istanbul ignore next */
if (cached) return Promise.resolve(cached);
const p = realpath(physical, rpcache, stcache, 0).then(
real =>
rpj(join(real, 'package.json'))
.then(
pkg => [pkg, null],
er => [null, er],
)
.then(([pkg, er]) =>
physical === real
? newNode(pkg, logical, physical, er, cache)
: new Link(pkg, logical, physical, real, er, cache),
),
// if the realpath fails, don't bother with the rest
er => new Node(null, logical, physical, er, cache),
);
cache.set(physical, p);
return p;
};
const loadChildren = (node, cache, filterWith, rpcache, stcache, mode) => {
// if a Link target has started, but not completed, then
// a Promise will be in the cache to indicate this.
//
// XXX When we can one day loadChildren on the link *target* instead of
// the link itself, to match real dep resolution, then we may end up with
// a node target in the cache that isn't yet done resolving when we get
// here. For now, though, this line will never be reached, so it's hidden
//
// if (node.then)
// return node.then(node => loadChildren(node, cache, filterWith, rpcache, stcache))
let depFolder = 'node_modules';
if (mode === 'bower') {
// TODO: if people rename their bower_components folder to smth like "lib", please handle
depFolder = 'bower_components';
try {
const bowerrc = JSON.parse(fs.readFileSync(join(node.path, '.bowerrc')));
if (bowerrc && bowerrc.directory) {
depFolder = bowerrc.directory;
}
} catch (_) {}
}
const nm = join(node.path, depFolder);
// const nm = join(node.path, 'bower_components')
return realpath(nm, rpcache, stcache, 0)
.then(rm => readdir(rm).then(kids => [rm, kids]))
.then(([rm, kids]) =>
Promise.all(
kids
.filter(kid => kid.charAt(0) !== '.' && (!filterWith || filterWith(node, kid)))
.map(kid => loadNode(join(nm, kid), join(rm, kid), cache, rpcache, stcache)),
),
)
.then(kidNodes => {
kidNodes.forEach(k => (k.parent = node));
node.children.push.apply(
node.children,
kidNodes.sort((a, b) =>
(a.package.name ? a.package.name.toLowerCase() : a.path).localeCompare(
b.package.name ? b.package.name.toLowerCase() : b.path,
),
),
);
return node;
})
.catch(() => node);
};
const loadTree = (node, did, cache, filterWith, rpcache, stcache, mode) => {
// impossible except in pathological ELOOP cases
/* istanbul ignore next */
if (did.has(node.realpath)) return Promise.resolve(node);
did.add(node.realpath);
// load children on the target, not the link
return loadChildren(node, cache, filterWith, rpcache, stcache, mode)
.then(node =>
Promise.all(
node.children
.filter(kid => !did.has(kid.realpath))
.map(kid => loadTree(kid, did, cache, filterWith, rpcache, stcache, mode)),
),
)
.then(() => node);
};
// XXX Drop filterWith and/or cb in next semver major bump
/**
*
* @param {*} root
* @param {*} filterWith
* @param {*} cb
* @param {'npm'|'bower'} [mode='npm'] if mode is 'bower', will look in 'bower_components' instead
* of 'node_modules'
*/
const rpt = (root, filterWith, cb, mode = 'npm') => {
if (!cb && typeof filterWith === 'function') {
cb = filterWith;
filterWith = null;
}
const cache = new Map();
// we can assume that the cwd is real enough
const cwd = process.cwd();
const rpcache = new Map([[cwd, cwd]]);
const stcache = new Map();
const p = realpath(root, rpcache, stcache, 0)
.then(realRoot => loadNode(root, realRoot, cache, rpcache, stcache))
.then(node => loadTree(node, new Set(), cache, filterWith, rpcache, stcache, mode));
if (typeof cb === 'function') p.then(tree => cb(null, tree), cb);
return p;
};
rpt.Node = Node;
rpt.Link = Link;
module.exports = rpt;

View file

@ -0,0 +1,24 @@
/**
* @desc determines for a source path of an import- or export specifier, whether
* it is relative (an internal import/export) or absolute (external)
* - relative: './helpers', './helpers.js', '../helpers.js'
* - not relative: '@open-wc/helpers', 'project-x/helpers'
* @param {string} source source path of an import- or export specifier
* @returns {boolean}
*/
function isRelativeSourcePath(source) {
return source.startsWith('.');
}
/**
 * @desc Simple helper to make code a bit more readable.
* - from '/path/to/repo/my/file.js';
* - to './my/file.js'
* @param {string} fullPath like '/path/to/repo/my/file.js'
* @param {string} rootPath like '/path/to/repo'
*/
function toRelativeSourcePath(fullPath, rootPath) {
return fullPath.replace(rootPath, '.');
}
module.exports = { isRelativeSourcePath, toRelativeSourcePath };

View file

@ -0,0 +1,50 @@
/**
* Solution inspired by es-dev-server:
* https://github.com/open-wc/open-wc/blob/master/packages/es-dev-server/src/utils/resolve-module-imports.js
*/
const pathLib = require('path');
const nodeResolvePackageJson = require('@rollup/plugin-node-resolve/package.json');
const createRollupResolve = require('@rollup/plugin-node-resolve');
const { LogService } = require('../services/LogService.js');
const fakePluginContext = {
meta: {
rollupVersion: nodeResolvePackageJson.peerDependencies.rollup,
},
warn(...msg) {
LogService.warn('[resolve-import-path]: ', ...msg);
},
};
/**
 * @desc resolves an importee to a file system path. In a statement "import {x} from '@lion/core'",
 * "@lion/core" is the importee. It can be a bare module specifier, a filename without extension,
 * or a folder name without an extension.
 * @param {string} importee source like '@lion/core'
 * @param {string} importer importing file, like '/my/project/importing-file.js'
 * @param {object} [opts] options forwarded to @rollup/plugin-node-resolve
 * @returns {string|null} the resolved file system path, like '/my/project/node_modules/@lion/core/index.js'
*/
async function resolveImportPath(importee, importer, opts = {}) {
const rollupResolve = createRollupResolve({
rootDir: pathLib.dirname(importer),
// allow resolving polyfills for nodejs libs
preferBuiltins: false,
// extensions: ['.mjs', '.js', '.json', '.node'],
...opts,
});
const preserveSymlinks =
(opts && opts.customResolveOptions && opts.customResolveOptions.preserveSymlinks) || false;
rollupResolve.buildStart.call(fakePluginContext, { preserveSymlinks });
const result = await rollupResolve.resolveId.call(fakePluginContext, importee, importer);
if (!result || !result.id) {
// throw new Error(`importee ${importee} not found in filesystem.`);
LogService.warn(`importee ${importee} not found in filesystem for importer '${importer}'.`);
return null;
}
return result.id;
}
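// Illustrative usage (paths are hypothetical):
// const resolved = await resolveImportPath('@lion/core', '/my/project/importing-file.js');
// // -> '/my/project/node_modules/@lion/core/index.js' (or null plus a warning when not found)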
module.exports = { resolveImportPath };

View file

@ -0,0 +1,28 @@
/**
* @param {ASTNode} curNode Node to start from. Will loop over its children
 * @param {object} processObject map from nodeName to handler; the handler is executed for every
 * matching node and receives a path object ({ node, traverse })
*/
function traverseHtml(curNode, processObject) {
function pathify(node) {
return {
node,
traverse(obj) {
traverseHtml(node, obj);
},
};
}
// let done = processFn(curNode, parentNode);
if (processObject[curNode.nodeName]) {
processObject[curNode.nodeName](pathify(curNode));
}
if (curNode.childNodes) {
curNode.childNodes.forEach(childNode => {
traverseHtml(childNode, processObject, curNode);
});
}
}
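// Illustrative usage with a parse5-like AST (hypothetical, not part of the original file):
// traverseHtml(htmlAst, {
//   'my-tag': ({ node, traverse }) => {
//     // inspect node here; call traverse({ ... }) to descend with a new processObject
//   },
// });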
module.exports = traverseHtml;

View file

@ -0,0 +1,57 @@
const { LogService } = require('../src/program/services/LogService.js');
const originalWarn = LogService.warn;
function suppressWarningLogs() {
LogService.warn = () => {};
}
function restoreSuppressWarningLogs() {
LogService.warn = originalWarn;
}
const originalInfo = LogService.info;
function suppressInfoLogs() {
LogService.info = () => {};
}
function restoreSuppressInfoLogs() {
LogService.info = originalInfo;
}
const originalDebug = LogService.debug;
function suppressDebugLogs() {
LogService.debug = () => {};
}
function restoreSuppressDebugLogs() {
LogService.debug = originalDebug;
}
const originalSuccess = LogService.success;
function suppressSuccessLogs() {
LogService.success = () => {};
}
function restoreSuppressSuccessLogs() {
LogService.success = originalSuccess;
}
function suppressNonCriticalLogs() {
suppressInfoLogs();
suppressWarningLogs();
suppressDebugLogs();
suppressSuccessLogs();
}
function restoreSuppressNonCriticalLogs() {
restoreSuppressInfoLogs();
restoreSuppressWarningLogs();
restoreSuppressDebugLogs();
restoreSuppressSuccessLogs();
}
module.exports = {
suppressWarningLogs,
restoreSuppressWarningLogs,
suppressInfoLogs,
restoreSuppressInfoLogs,
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
};

View file

@ -0,0 +1,125 @@
// eslint-disable-next-line import/no-extraneous-dependencies
const mockFs = require('mock-fs');
const path = require('path');
/**
* @desc Makes sure that, whenever the main program (providence) calls
* "InputDataService.createDataObject", it gives back a mocked response.
 * @param {string[]|object} files all the code that will be run through the AST
 * @param {object} [cfg]
 * @param {string} [cfg.projectName='fictional-project']
 * @param {string} [cfg.projectPath='/fictional/project']
 * @param {string[]} [cfg.filePaths=[`./test-file-${i}.js`]] The indexes of the file
 * paths match with the indexes of the files
* @param {object} existingMock config for mock-fs, so the previous config is not overridden
*/
function mockProject(files, cfg = {}, existingMock = {}) {
const projName = cfg.projectName || 'fictional-project';
const projPath = cfg.projectPath || '/fictional/project';
// Create obj structure for mock-fs
// eslint-disable-next-line no-shadow
function createFilesObjForFolder(files) {
let projFilesObj = {};
if (Array.isArray(files)) {
projFilesObj = files.reduce((res, code, i) => {
const fileName = (cfg.filePaths && cfg.filePaths[i]) || `./test-file-${i}.js`;
const localFileName = path.resolve(projPath, fileName);
res[localFileName] = code;
return res;
}, {});
} else {
Object.keys(files).forEach(f => {
const localFileName = path.resolve(projPath, f);
projFilesObj[localFileName] = files[f];
});
}
return projFilesObj;
}
const optionalPackageJson = {};
const hasPackageJson = cfg.filePaths && cfg.filePaths.includes('./package.json');
if (!hasPackageJson) {
optionalPackageJson[projPath] = {
'package.json': `{ "name": "${projName}" , "version": "${cfg.version || '0.1.0-mock'}" }`,
};
}
const totalMock = {
...existingMock, // can only add to mock-fs, not expand existing config?
...optionalPackageJson,
...createFilesObjForFolder(files),
};
mockFs(totalMock);
return totalMock;
}
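// Illustrative usage (file contents and paths are hypothetical):
// mockProject(["export const x = 1;"], {
//   projectName: 'my-mocked-project',
//   projectPath: '/mocked/project',
//   filePaths: ['./src/x.js'],
// });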
function restoreMockedProjects() {
mockFs.restore();
}
function getEntry(queryResult, index = 0) {
return queryResult.queryOutput[index];
}
function getEntries(queryResult) {
return queryResult.queryOutput;
}
/**
* Requires two config objects (see match-imports and match-subclasses tests)
* and based on those, will use mock-fs package to mock them in the file system.
 * All missing information (like the target depending on the ref, version numbers, project names
 * and paths) will be auto generated when not specified.
* When a non imported ref dependency or a wrong version of a dev dependency needs to be
* tested, please explicitly provide a ./package.json that does so.
*/
function mockTargetAndReferenceProject(searchTargetProject, referenceProject) {
const targetProjectName = searchTargetProject.name || 'fictional-target-project';
const refProjectName = referenceProject.name || 'fictional-ref-project';
const targetcodeSnippets = searchTargetProject.files.map(f => f.code);
const targetFilePaths = searchTargetProject.files.map(f => f.file);
const refVersion = referenceProject.version || '1.0.0';
const targetHasPackageJson = targetFilePaths.includes('./package.json');
// Make target depend on ref
if (!targetHasPackageJson) {
targetcodeSnippets.push(`{
"name": "${targetProjectName}" ,
"version": "1.0.0",
"dependencies": {
"${refProjectName}": "${refVersion}"
}
}`);
targetFilePaths.push('./package.json');
}
// Create target mock
const targetMock = mockProject(targetcodeSnippets, {
filePaths: targetFilePaths,
projectName: targetProjectName,
projectPath: searchTargetProject.path || 'fictional/target/project',
});
// Append ref mock
mockProject(
referenceProject.files.map(f => f.code),
{
filePaths: referenceProject.files.map(f => f.file),
projectName: refProjectName,
projectPath: referenceProject.path || 'fictional/ref/project',
version: refVersion,
},
targetMock,
);
}
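// Illustrative usage (hypothetical file contents, not a fixture from this repo):
// mockTargetAndReferenceProject(
//   { name: 'my-target', files: [{ file: './src/a.js', code: "import { X } from 'my-ref';" }] },
//   { name: 'my-ref', files: [{ file: './index.js', code: 'export class X {}' }] },
// );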
module.exports = {
mockProject,
restoreMockedProjects,
getEntry,
getEntries,
mockTargetAndReferenceProject,
};

View file

@ -0,0 +1,21 @@
const { ReportService } = require('../src/program/services/ReportService.js');
const originalWriteToJson = ReportService.writeToJson;
function mockWriteToJson(queryResults) {
ReportService.writeToJson = queryResult => {
queryResults.push(queryResult);
};
}
function restoreWriteToJson(queryResults) {
ReportService.writeToJson = originalWriteToJson;
while (queryResults && queryResults.length) {
queryResults.pop();
}
}
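// Illustrative usage in a test (hypothetical):
// const queryResults = [];
// mockWriteToJson(queryResults); // collects results in memory instead of writing to disk
// // ... run providence ...
// restoreWriteToJson(queryResults); // restores the original method and empties the array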
module.exports = {
mockWriteToJson,
restoreWriteToJson,
};

View file

@ -0,0 +1,219 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "find-classes",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock__-297820780",
"targetProject": {
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"gatherFilesConfig": {},
"metaConfig": null
}
}
},
"queryOutput": [
{
"file": "./target-src/find-customelements/multiple.js",
"result": [
{
"name": null,
"isMixin": true,
"superClasses": [
{
"name": "HTMLElement",
"isMixin": false,
"rootFile": {
"file": "[current]",
"specifier": "HTMLElement"
}
}
],
"members": {
"props": [],
"methods": []
}
},
{
"name": "ExtendedOnTheFly",
"isMixin": false,
"superClasses": [
{
"isMixin": true,
"rootFile": {
"file": "[current]"
}
},
{
"isMixin": false,
"rootFile": {
"file": "[current]"
}
}
],
"members": {
"props": [],
"methods": []
}
}
]
},
{
"file": "./target-src/match-subclasses/ExtendedComp.js",
"result": [
{
"name": "ExtendedComp",
"isMixin": false,
"superClasses": [
{
"name": "MyCompMixin",
"isMixin": true,
"rootFile": {
"file": "exporting-ref-project",
"specifier": "[default]"
}
},
{
"name": "RefClass",
"isMixin": false,
"rootFile": {
"file": "exporting-ref-project",
"specifier": "RefClass"
}
}
],
"members": {
"props": [
{
"name": "getterSetter",
"accessType": "public",
"kind": [
"get",
"set"
]
},
{
"name": "staticGetterSetter",
"accessType": "public",
"static": true,
"kind": [
"get",
"set"
]
},
{
"name": "attributes",
"accessType": "public",
"static": true,
"kind": [
"get"
]
},
{
"name": "styles",
"accessType": "public",
"static": true,
"kind": [
"get"
]
},
{
"name": "updateComplete",
"accessType": "public",
"kind": [
"get"
]
},
{
"name": "localizeNamespaces",
"accessType": "public",
"static": true,
"kind": [
"get"
]
},
{
"name": "slots",
"accessType": "public",
"kind": [
"get"
]
}
],
"methods": [
{
"name": "method",
"accessType": "public"
},
{
"name": "_protectedMethod",
"accessType": "protected"
},
{
"name": "__privateMethod",
"accessType": "private"
},
{
"name": "$protectedMethod",
"accessType": "protected"
},
{
"name": "$$privateMethod",
"accessType": "private"
},
{
"name": "constructor",
"accessType": "public"
},
{
"name": "connectedCallback",
"accessType": "public"
},
{
"name": "disconnectedCallback",
"accessType": "public"
},
{
"name": "_requestUpdate",
"accessType": "protected"
},
{
"name": "createRenderRoot",
"accessType": "public"
},
{
"name": "render",
"accessType": "public"
},
{
"name": "updated",
"accessType": "public"
},
{
"name": "firstUpdated",
"accessType": "public"
},
{
"name": "update",
"accessType": "public"
},
{
"name": "shouldUpdate",
"accessType": "public"
},
{
"name": "onLocaleUpdated",
"accessType": "public"
}
]
}
}
]
}
]
}

View file

@ -0,0 +1,50 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "find-customelements",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock__-2006922104",
"targetProject": {
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"gatherFilesConfig": {}
}
}
},
"queryOutput": [
{
"file": "./target-src/find-customelements/multiple.js",
"result": [
{
"tagName": "ref-class",
"constructorIdentifier": "RefClass",
"rootFile": {
"file": "exporting-ref-project",
"specifier": "RefClass"
}
},
{
"tagName": "extended-comp",
"constructorIdentifier": "ExtendedComp",
"rootFile": {
"file": "./target-src/match-subclasses/ExtendedComp.js",
"specifier": "ExtendedComp"
}
},
{
"tagName": "on-the-fly",
"constructorIdentifier": "[inline]",
"rootFile": {
"file": "[current]",
"specifier": "[inline]"
}
}
]
}
]
}

View file

@ -0,0 +1,195 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "find-exports",
"requiredAst": "babel",
"identifier": "exporting-ref-project_1.0.0__-1083884764",
"targetProject": {
"mainEntry": "./index.js",
"name": "exporting-ref-project",
"version": "1.0.0",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"metaConfig": null,
"gatherFilesConfig": {}
}
}
},
"queryOutput": [
{
"file": "./index.js",
"result": [
{
"exportSpecifiers": [
"[default]"
],
"source": "refConstImported",
"normalizedSource": "refConstImported",
"rootFileMap": [
{
"currentFileSpecifier": "[default]",
"rootFile": {
"file": "refConstImported",
"specifier": "[default]"
}
}
]
},
{
"exportSpecifiers": [
"RefClass",
"RefRenamedClass"
],
"localMap": [
{
"local": "RefClass",
"exported": "RefRenamedClass"
}
],
"source": "./ref-src/core.js",
"normalizedSource": "./ref-src/core.js",
"rootFileMap": [
{
"currentFileSpecifier": "RefClass",
"rootFile": {
"file": "./ref-src/core.js",
"specifier": "RefClass"
}
},
{
"currentFileSpecifier": "RefRenamedClass",
"rootFile": {
"file": "./ref-src/core.js",
"specifier": "RefClass"
}
}
]
},
{
"exportSpecifiers": [
"[file]"
],
"rootFileMap": [
null
]
}
]
},
{
"file": "./not-imported.js",
"result": [
{
"exportSpecifiers": [
"notImported"
],
"localMap": [],
"source": null,
"rootFileMap": [
{
"currentFileSpecifier": "notImported",
"rootFile": {
"file": "[current]",
"specifier": "notImported"
}
}
]
},
{
"exportSpecifiers": [
"[file]"
],
"rootFileMap": [
null
]
}
]
},
{
"file": "./ref-component.js",
"result": [
{
"exportSpecifiers": [
"[file]"
],
"rootFileMap": [
null
]
}
]
},
{
"file": "./ref-src/core.js",
"result": [
{
"exportSpecifiers": [
"RefClass"
],
"localMap": [],
"source": null,
"rootFileMap": [
{
"currentFileSpecifier": "RefClass",
"rootFile": {
"file": "[current]",
"specifier": "RefClass"
}
}
]
},
{
"exportSpecifiers": [
"[default]"
],
"rootFileMap": [
{
"currentFileSpecifier": "[default]",
"rootFile": {
"file": "[current]",
"specifier": "[default]"
}
}
]
},
{
"exportSpecifiers": [
"[file]"
],
"rootFileMap": [
null
]
}
]
},
{
"file": "./ref-src/folder/index.js",
"result": [
{
"exportSpecifiers": [
"resolvePathCorrect"
],
"localMap": [],
"source": null,
"rootFileMap": [
{
"currentFileSpecifier": "resolvePathCorrect",
"rootFile": {
"file": "[current]",
"specifier": "resolvePathCorrect"
}
}
]
},
{
"exportSpecifiers": [
"[file]"
],
"rootFileMap": [
null
]
}
]
}
]
}

View file

@ -0,0 +1,202 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "find-imports",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock__139587347",
"targetProject": {
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"keepInternalSources": false,
"gatherFilesConfig": {}
}
}
},
"queryOutput": [
{
"file": "./target-src/find-customelements/multiple.js",
"result": [
{
"importSpecifiers": [
"RefClass"
],
"source": "exporting-ref-project",
"normalizedSource": "exporting-ref-project"
}
]
},
{
"file": "./target-src/find-imports/all-notations.js",
"result": [
{
"importSpecifiers": [
"[file]"
],
"source": "imported/source",
"normalizedSource": "imported/source"
},
{
"importSpecifiers": [
"[default]"
],
"source": "imported/source-a",
"normalizedSource": "imported/source-a"
},
{
"importSpecifiers": [
"b"
],
"source": "imported/source-b",
"normalizedSource": "imported/source-b"
},
{
"importSpecifiers": [
"c",
"d"
],
"source": "imported/source-c",
"normalizedSource": "imported/source-c"
},
{
"importSpecifiers": [
"[default]",
"f",
"g"
],
"source": "imported/source-d",
"normalizedSource": "imported/source-d"
},
{
"importSpecifiers": [
"[default]"
],
"source": "my/source-e",
"normalizedSource": "my/source-e"
},
{
"importSpecifiers": [
"[default]"
],
"source": "[variable]",
"normalizedSource": "[variable]"
},
{
"importSpecifiers": [
"[*]"
],
"source": "imported/source-g",
"normalizedSource": "imported/source-g"
}
]
},
{
"file": "./target-src/match-imports/deep-imports.js",
"result": [
{
"importSpecifiers": [
"RefClass"
],
"source": "exporting-ref-project/ref-src/core.js",
"normalizedSource": "exporting-ref-project/ref-src/core.js"
},
{
"importSpecifiers": [
"[default]"
],
"source": "exporting-ref-project/ref-src/core.js",
"normalizedSource": "exporting-ref-project/ref-src/core.js"
},
{
"importSpecifiers": [
"nonMatched"
],
"source": "unknown-project/xyz.js",
"normalizedSource": "unknown-project/xyz.js"
},
{
"importSpecifiers": [
"[file]"
],
"source": "exporting-ref-project/ref-component",
"normalizedSource": "exporting-ref-project/ref-component"
},
{
"importSpecifiers": [
"resolvePathCorrect"
],
"source": "exporting-ref-project/ref-src/folder",
"normalizedSource": "exporting-ref-project/ref-src/folder"
},
{
"importSpecifiers": [
"[*]"
],
"source": "exporting-ref-project/ref-src/core.js",
"normalizedSource": "exporting-ref-project/ref-src/core.js"
}
]
},
{
"file": "./target-src/match-imports/root-level-imports.js",
"result": [
{
"importSpecifiers": [
"RefClass"
],
"source": "exporting-ref-project",
"normalizedSource": "exporting-ref-project"
},
{
"importSpecifiers": [
"RefRenamedClass"
],
"source": "exporting-ref-project",
"normalizedSource": "exporting-ref-project"
},
{
"importSpecifiers": [
"[default]"
],
"source": "exporting-ref-project",
"normalizedSource": "exporting-ref-project"
},
{
"importSpecifiers": [
"nonMatched"
],
"source": "unknown-project",
"normalizedSource": "unknown-project"
}
]
},
{
"file": "./target-src/match-subclasses/ExtendedComp.js",
"result": [
{
"importSpecifiers": [
"RefClass"
],
"source": "exporting-ref-project",
"normalizedSource": "exporting-ref-project"
}
]
},
{
"file": "./target-src/match-subclasses/internalProxy.js",
"result": [
{
"importSpecifiers": [
"[default]"
],
"source": "exporting-ref-project",
"normalizedSource": "exporting-ref-project"
}
]
}
]
}

View file

@ -0,0 +1,158 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "match-imports",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock_+_exporting-ref-project_1.0.0__453069400",
"targetProject": {
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"commitHash": "[not-a-git-root]"
},
"referenceProject": {
"mainEntry": "./index.js",
"name": "exporting-ref-project",
"version": "1.0.0",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"gatherFilesConfig": {}
}
}
},
"queryOutput": [
{
"exportSpecifier": {
"name": "[default]",
"project": "exporting-ref-project",
"filePath": "./index.js",
"id": "[default]::./index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/root-level-imports.js",
"./target-src/match-subclasses/internalProxy.js"
]
}
]
},
{
"exportSpecifier": {
"name": "RefClass",
"project": "exporting-ref-project",
"filePath": "./index.js",
"id": "RefClass::./index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/find-customelements/multiple.js",
"./target-src/match-imports/root-level-imports.js",
"./target-src/match-subclasses/ExtendedComp.js"
]
}
]
},
{
"exportSpecifier": {
"name": "RefRenamedClass",
"project": "exporting-ref-project",
"filePath": "./index.js",
"id": "RefRenamedClass::./index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/root-level-imports.js"
]
}
]
},
{
"exportSpecifier": {
"name": "[file]",
"project": "exporting-ref-project",
"filePath": "./ref-component.js",
"id": "[file]::./ref-component.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/deep-imports.js"
]
}
]
},
{
"exportSpecifier": {
"name": "RefClass",
"project": "exporting-ref-project",
"filePath": "./ref-src/core.js",
"id": "RefClass::./ref-src/core.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/deep-imports.js"
]
}
]
},
{
"exportSpecifier": {
"name": "[default]",
"project": "exporting-ref-project",
"filePath": "./ref-src/core.js",
"id": "[default]::./ref-src/core.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/deep-imports.js"
]
}
]
},
{
"exportSpecifier": {
"name": "[file]",
"project": "exporting-ref-project",
"filePath": "./ref-src/core.js",
"id": "[file]::./ref-src/core.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/deep-imports.js"
]
}
]
},
{
"exportSpecifier": {
"name": "resolvePathCorrect",
"project": "exporting-ref-project",
"filePath": "./ref-src/folder/index.js",
"id": "resolvePathCorrect::./ref-src/folder/index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
"./target-src/match-imports/deep-imports.js"
]
}
]
}
]
}

View file

@ -0,0 +1,92 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "match-paths",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock_+_exporting-ref-project_1.0.0__-238486383",
"targetProject": {
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"commitHash": "[not-a-git-root]"
},
"referenceProject": {
"mainEntry": "./index.js",
"name": "exporting-ref-project",
"version": "1.0.0",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"gatherFilesConfig": {},
"prefix": null
}
}
},
"queryOutput": [
{
"name": "[default]",
"variable": {
"from": "[default]",
"to": "ExtendedComp",
"paths": [
{
"from": "./index.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
},
{
"from": "./ref-src/core.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
},
{
"from": "exporting-ref-project/index.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
},
{
"from": "exporting-ref-project/ref-src/core.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
}
]
}
},
{
"name": "RefClass",
"variable": {
"from": "RefClass",
"to": "ExtendedComp",
"paths": [
{
"from": "./index.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
},
{
"from": "./ref-src/core.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
},
{
"from": "exporting-ref-project/index.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
},
{
"from": "exporting-ref-project/ref-src/core.js",
"to": "./target-src/match-subclasses/ExtendedComp.js"
}
]
},
"tag": {
"from": "ref-component",
"to": "extended-comp",
"paths": [
{
"from": "./ref-component.js",
"to": "./target-src/find-customelements/multiple.js"
},
{
"from": "exporting-ref-project/ref-component.js",
"to": "./target-src/find-customelements/multiple.js"
}
]
}
}
]
}

View file

@ -0,0 +1,65 @@
{
"meta": {
"searchType": "ast-analyzer",
"analyzerMeta": {
"name": "match-subclasses",
"requiredAst": "babel",
"identifier": "importing-target-project_0.0.2-target-mock_+_exporting-ref-project_1.0.0__453069400",
"targetProject": {
"mainEntry": "./target-src/match-imports/root-level-imports.js",
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"commitHash": "[not-a-git-root]"
},
"referenceProject": {
"mainEntry": "./index.js",
"name": "exporting-ref-project",
"version": "1.0.0",
"commitHash": "[not-a-git-root]"
},
"configuration": {
"gatherFilesConfig": {}
}
}
},
"queryOutput": [
{
"exportSpecifier": {
"name": "[default]",
"project": "exporting-ref-project",
"filePath": "./index.js",
"id": "[default]::./index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
{
"file": "./target-src/match-subclasses/ExtendedComp.js",
"identifier": "ExtendedComp"
}
]
}
]
},
{
"exportSpecifier": {
"name": "RefClass",
"project": "exporting-ref-project",
"filePath": "./index.js",
"id": "RefClass::./index.js::exporting-ref-project"
},
"matchesPerProject": [
{
"project": "importing-target-project",
"files": [
{
"file": "./target-src/match-subclasses/ExtendedComp.js",
"identifier": "ExtendedComp"
}
]
}
]
}
]
}

View file

@ -0,0 +1,12 @@
# Project mocks
The number of project-mocks is kept to a minimum:
- one target project: "./importing-target-project"
- one reference project: "./importing-target-project/node_modules/exporting-ref-project"
Whenever new Analyzers are added, please make sure the needed ingredients for a proper
end to end test are added to one of the above projects (or both).
Be sure to update 'test-helpers/project-mocks-analyzer-output'.
This can be done by running `npm run test:e2e -- --generate-e2e-mode` once.

View file

@ -0,0 +1 @@
!node_modules/

View file

@ -0,0 +1,8 @@
/* eslint-disable */
// re-exported default specifier
import refConstImported from './ref-src/core.js';
export default refConstImported;
// re-exported specifier
export { RefClass, RefClass as RefRenamedClass } from './ref-src/core.js';

View file

@ -0,0 +1,2 @@
// this file will not be included by "importing-target-project" defined below
export const notImported = null;

View file

@ -0,0 +1,5 @@
{
"name": "exporting-ref-project",
"version": "1.0.0",
"main": "./index.js"
}

View file

@ -0,0 +1,4 @@
// global effects
import { RefClass } from './ref-src/core.js';
customElements.define('ref-component', RefClass);

View file

@ -0,0 +1,11 @@
/* eslint-disable */
// named specifier
export class RefClass extends HTMLElement {
methodToInherit() {}
};
// default specifier
export default superclass => class MyMixin extends superclass {
mixinMethodToInherit() {}
};

View file

@ -0,0 +1,3 @@
// this file (and thus this export) should be resolved via
// [import 'exporting-ref-project/ref-src/folder']
export const resolvePathCorrect = null;

View file

@ -0,0 +1,8 @@
{
"name": "importing-target-project",
"version": "0.0.2-target-mock",
"main": "./target-src/match-imports/root-level-imports.js",
"dependencies": {
"exporting-ref-project": "^1.0.0"
}
}

View file

@ -0,0 +1,18 @@
/* eslint-disable max-classes-per-file */
import { RefClass } from 'exporting-ref-project';
import { ExtendedComp } from '../match-subclasses/ExtendedComp.js';
// external
customElements.define('ref-class', RefClass);
// internal (+ via window and inside CallExpression)
(() => {
window.customElements.define('extended-comp', ExtendedComp);
})();
// direct class (not supported atm)
// To connect this to a constructor, we should also detect customElements.get()
customElements.define('on-the-fly', class extends HTMLElement {});
// eslint-disable-next-line no-unused-vars
class ExtendedOnTheFly extends customElements.get('on-the-fly') {}

View file

@ -0,0 +1,24 @@
/* eslint-disable */
// ImportDeclaration without specifiers
import 'imported/source';
// ImportDeclaration with default specifier
import a from 'imported/source-a';
// ImportDeclaration with named specifier
import { b } from 'imported/source-b';
// ImportDeclaration with multiple named specifiers
import { c, d } from 'imported/source-c';
// ImportDeclaration with default and named specifiers
import e, { f, g } from 'imported/source-d';
// Internal file import
import '../match-imports/deep-imports'; // Notice extension is missing, will be auto resolved
// Dynamic import
import('my/source-e');
// Dynamic import with variables. TODO: how to handle?
const variable = 'f';
import(`my/source${variable}`);
// namespaced
import * as all from 'imported/source-g';

View file

@ -0,0 +1,27 @@
/* eslint-disable */
// a direct named import
import { RefClass } from 'exporting-ref-project/ref-src/core.js';
// a direct default import
import refConst from 'exporting-ref-project/ref-src/core.js';
// should not be found
import { nonMatched } from 'unknown-project/xyz.js';
/**
* Examples below should be resolved to the proper filepath (filename + extension)
* (direct or indirect is not relevant in this case, it is about the source and not the
* specifier)
*/
// Two things:
// - a file with side effects
// - should resolve "as file", to 'exporting-ref-project/ref-component.js'
import 'exporting-ref-project/ref-component';
// - should resolve "as folder", to 'exporting-ref-project/ref-src/folder/index.js'
import { resolvePathCorrect } from 'exporting-ref-project/ref-src/folder';
// should match all exportSpecifiers from 'exporting-ref-project/ref-src/core.js'
import * as all from 'exporting-ref-project/ref-src/core.js';

View file

@ -0,0 +1,13 @@
/* eslint-disable */
// named import (indirect, needs transitivity check)
import { RefClass } from 'exporting-ref-project';
// renamed import (indirect, needs transitivity check)
import { RefRenamedClass } from 'exporting-ref-project';
// default (indirect, needs transitivity check)
import refConstImported from 'exporting-ref-project';
// should not be found
import { nonMatched } from 'unknown-project';

View file

@ -0,0 +1,46 @@
/* eslint-disable */
import { RefClass } from 'exporting-ref-project';
import MyCompMixin from './internalProxy.js';
export class ExtendedComp extends MyCompMixin(RefClass) {
/**
* Whitelisted members
*/
get getterSetter() {}
set getterSetter(v) {}
static get staticGetterSetter() {}
static set staticGetterSetter(v) {}
method() {}
_protectedMethod() {}
__privateMethod() {}
$protectedMethod() {}
$$privateMethod() {}
/**
* Blacklisted platform methods and props by find-classes
*/
static get attributes() {}
constructor() {}
connectedCallback() {}
disconnectedCallback() {}
/**
* Blacklisted LitElement methods and props by find-classes
*/
static get properties() {}
static get styles() {}
get updateComplete() {}
_requestUpdate() {}
createRenderRoot() {}
render() {}
updated() {}
firstUpdated() {}
update() {}
shouldUpdate() {}
/**
* Blacklisted Lion methods and props by find-classes
*/
static get localizeNamespaces() {}
get slots() {}
onLocaleUpdated() {}
}

View file

@ -0,0 +1,4 @@
/* eslint-disable */
import MyCompMixin from 'exporting-ref-project';
export default MyCompMixin;

View file

@ -0,0 +1,107 @@
const { Analyzer } = require('../../src/program/analyzers/helpers/Analyzer.js');
/**
* This file outlines the minimum required functionality for an analyzer.
* Whenever a new analyzer is created, this file can serve as a guideline on how to do this.
* For 'match-analyzers' (having requiresReference: true), please look in the analyzers folder for
* an example
*/
/**
* Everything that is configured via {AnalyzerConfig} [customConfig] in the execute
* function, should be configured here
*/
const options = {
optionA(entryResult) {
// here, we perform a transformation on the entryResult
return entryResult;
},
};
/**
* This file takes the output of one AST (or 'program'), which
* corresponds to one file.
* The contents of this function should be designed in such a way that they
* can be directly pasted and edited in https://astexplorer.net/
* @param {BabelAST} ast
* @returns {TransformedEntry}
*/
// eslint-disable-next-line no-unused-vars
function myAnalyzerPerAstEntry(ast) {
// Visit AST...
const transformedEntryResult = [];
// Do the traverse: https://babeljs.io/docs/en/babel-traverse
// Inside your traverse function, push an entry whenever there is a match for the intended analysis
transformedEntryResult.push({ matched: 'entry' });
return transformedEntryResult;
}
class MyAnalyzer extends Analyzer {
constructor() {
super();
/**
* This must match the name in the file system (it will be used for reporting)
*/
this.name = 'my-analyzer';
/**
* The ast format that the execute function expects
* Compatible with formats supported by AstService.getAst()
*/
this.requiredAst = 'babel';
/**
* Not all analyzers require a reference project. Those that do (usually 'match analyzers')
* must explicitly state so with `requiresReference: true`
*/
}
/**
* @param {AnalyzerConfig} [customConfig]
* @returns {QueryResult}
*/
async execute(customConfig = {}) {
const cfg = {
targetProjectPaths: null,
optionA: false,
optionB: '',
...customConfig,
};
/**
* Prepare
*/
const analyzerResult = this._prepare(cfg);
if (analyzerResult) {
return analyzerResult;
}
/**
* Traverse
*/
const queryOutput = await this._traverse((ast, astContext) => {
// Run the traversal per entry
let transformedEntryResult = myAnalyzerPerAstEntry(ast);
const meta = {};
// (optional): Post processors on TransformedEntry
if (cfg.optionA) {
// Run entry transformation based on option A
transformedEntryResult = options.optionA(transformedEntryResult);
}
return { result: transformedEntryResult, meta };
});
// (optional): Post processors on TransformedQueryResult
if (cfg.optionB) {
// Run your QueryResult transformation based on option B
}
/**
* Finalize
*/
return this._finalize(queryOutput, cfg);
}
}
module.exports = MyAnalyzer;
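
As a quick orientation, a minimal sketch of how such an analyzer could be run through providence, mirroring the Analyzer unit test further below. The target path is hypothetical and the require paths assume the providence package root:

```js
// Minimal sketch: wrap the template analyzer in a QueryConfig and hand it to providence.
const { providence } = require('./src/program/providence.js');
const { QueryService } = require('./src/program/services/QueryService.js');
const MyAnalyzer = require('./test-helpers/templates/analyzer-template.js');

async function run() {
  // optionA / optionB end up in the analyzer's customConfig
  const queryConfig = QueryService.getQueryConfigFromAnalyzer(MyAnalyzer);
  await providence(queryConfig, { targetProjectPaths: ['/path/to/my-project'] });
  // The ReportService writes the resulting QueryResult to disk
  // (the unit tests below mock this step via mockWriteToJson).
}

run();
```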

View file

@ -0,0 +1,44 @@
const /** @type {PostProcessorOptions} */ options = {
optionA(transformedResult) {
return transformedResult;
},
};
/**
*
* @param {AnalyzerResult} analyzerResult
* @param {FindImportsConfig} customConfig
* @returns {AnalyzerResult}
*/
function myPostProcessor(analyzerResult, customConfig) {
const cfg = {
optionA: null,
...customConfig,
};
let transformedResult = analyzerResult.map(({ entries, project }) => {
// eslint-disable-next-line no-unused-vars
const projectName = project.name;
return entries.map(entry =>
entry.result.map(resultForEntry => ({
transformed: resultForEntry.foo,
output: resultForEntry.bar,
})),
);
});
if (cfg.optionA) {
transformedResult = options.optionA(transformedResult);
}
return /** @type {AnalyzerResult} */ transformedResult;
}
module.exports = {
name: 'my-post-processor',
execute: myPostProcessor,
compatibleAnalyzers: ['analyzer-template'],
// This means it transforms the result output of an analyzer, and multiple
// post processors cannot be chained after this one
modifiesOutputStructure: true,
};
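
For clarity, a rough sketch of invoking this template directly on a hand-made analyzer result. The input fields mirror what the function above maps over, the require path is a guess, and the values are purely illustrative:

```js
// Illustrative only: `project`, `entries`, `foo` and `bar` are the fields the template reads;
// real AnalyzerResults carry more data. The require path below is assumed.
const postProcessor = require('./test-helpers/templates/post-processor-template.js');

const analyzerResult = [
  {
    project: { name: 'my-project' },
    entries: [{ result: [{ foo: 'a', bar: 'b' }] }],
  },
];
console.log(postProcessor.execute(analyzerResult, { optionA: false }));
// => [ [ [ { transformed: 'a', output: 'b' } ] ] ]
```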

View file

@ -0,0 +1,98 @@
const sinon = require('sinon');
const pathLib = require('path');
const { expect } = require('chai');
const {
mockProject,
// restoreMockedProjects,
} = require('../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../test-helpers/mock-log-service-helpers.js');
const { spawnProcess } = require('../../src/cli/cli-helpers.js');
const { QueryService } = require('../../src/program/services/QueryService.js');
const providenceModule = require('../../src/program/providence.js');
const dummyAnalyzer = require('../../test-helpers/templates/analyzer-template.js');
const queryResults = [];
describe('Providence CLI', () => {
before(() => {
suppressNonCriticalLogs();
mockWriteToJson(queryResults);
});
after(() => {
restoreSuppressNonCriticalLogs();
restoreWriteToJson();
});
mockProject(
{
'./src/OriginalComp.js': `export class OriginalComp {}`,
'./src/inbetween.js': `export { OriginalComp as InBetweenComp } from './OriginalComp.js'`,
'./index.js': `export { InBetweenComp as MyComp } from './src/inbetween.js'`,
},
{
project: 'example-project',
path: '/mocked/path',
},
);
const rootDir = pathLib.resolve(__dirname, '../../');
async function cli(args) {
return spawnProcess(`node ./src/cli/index.js ${args}`, { cwd: rootDir });
}
async function cliAnalyze(args) {
return spawnProcess(`node ./src/cli/index.js analyze find-exports ${args}`, { cwd: rootDir });
}
it('creates a QueryConfig', async () => {
const stub = sinon.stub(QueryService, 'getQueryConfigFromAnalyzer');
await cliAnalyze('-t "/mocked/path/example-project"');
expect(stub.args[0]).to.equal('find-exports');
});
it('calls providence', async () => {
const providenceStub = sinon.stub(providenceModule, 'providence');
await cliAnalyze('-t "/mocked/path/example-project"');
expect(providenceStub).to.have.been.called;
});
describe('Global options', () => {
it('"-e --extensions"', async () => {
const providenceStub = sinon.stub(providenceModule, 'providence');
await cli('--extensions ".bla, .blu"');
expect(providenceStub.args[1].gatherFilesConfig.extensions).to.eql(['bla', 'blu']);
});
it('"-t", "--search-target-paths"', async () => {});
it('"-r", "--reference-paths"', async () => {});
it('"--search-target-collection"', async () => {});
it('"--reference-collection"', async () => {});
it.skip('"-R --verbose-report"', async () => {});
it.skip('"-D", "--debug"', async () => {});
});
describe('Commands', () => {
describe('Analyze', () => {
it('calls providence', async () => {
expect(typeof dummyAnalyzer.name).to.equal('string');
});
describe('Options', () => {
it('"-o", "--prompt-optional-config"', async () => {});
it('"-c", "--config"', async () => {});
});
});
describe('Query', () => {});
describe('Search', () => {});
describe('Manage', () => {});
});
});

View file

@ -0,0 +1,228 @@
const { expect } = require('chai');
const {
// mockTargetAndReferenceProject,
mockProject,
restoreMockedProjects,
} = require('../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../test-helpers/mock-log-service-helpers.js');
const { QueryService } = require('../../src/program/services/QueryService.js');
const { providence } = require('../../src/program/providence.js');
const dummyAnalyzer = require('../../test-helpers/templates/analyzer-template.js');
const queryResults = [];
describe('Analyzer', () => {
before(() => {
suppressNonCriticalLogs();
mockWriteToJson(queryResults);
});
after(() => {
restoreSuppressNonCriticalLogs();
restoreWriteToJson(queryResults);
});
describe('Public api', () => {
it('has a "name" string', async () => {
expect(typeof dummyAnalyzer.name).to.equal('string');
});
it('has an "execute" function', async () => {
expect(typeof dummyAnalyzer.execute).to.equal('function');
});
it('has a "requiredAst" string', async () => {
expect(typeof dummyAnalyzer.requiredAst).to.equal('string');
const allowedAsts = ['babel', 'typescript', 'es-module-lexer'];
expect(allowedAsts).to.include(dummyAnalyzer.requiredAst);
});
it('has a "requiresReference" boolean', async () => {
expect(typeof dummyAnalyzer.requiresReference).to.equal('boolean');
});
});
describe('Find Analyzers', async () => {
afterEach(() => {
restoreMockedProjects();
});
// Our configuration object
const myQueryConfigObject = QueryService.getQueryConfigFromAnalyzer(dummyAnalyzer);
mockProject([`const validJs = true;`, `let invalidJs = false;`], {
projectName: 'my-project',
projectPath: '/path/to/my-project',
filePaths: ['./test-file1.js', './test-file2.js'],
});
await providence(myQueryConfigObject, {
targetProjectPaths: ['/path/to/my-project'],
});
describe('Prepare phase', () => {
it('looks for a cached result', async () => {});
it('exposes a ".targetMeta" object', async () => {});
it('exposes a ".targetData" object', async () => {});
it('exposes a ".identifier" string', async () => {});
});
describe('Traverse phase', () => {});
describe('Finalize phase', () => {
it('returns an AnalyzerResult', async () => {
const queryResult = queryResults[0];
const { queryOutput, meta } = queryResult;
expect(queryOutput[0]).to.eql({
file: './test-file1.js',
meta: {},
result: [{ matched: 'entry' }],
});
expect(queryOutput[1]).to.eql({
file: './test-file2.js',
meta: {},
result: [{ matched: 'entry' }],
});
// Local machine info needs to be deleted, so that results are always 'machine agnostic'
// (which is needed to share cached json results via git)
expect(meta).to.eql({
searchType: 'ast-analyzer',
analyzerMeta: {
name: 'my-analyzer',
requiredAst: 'babel',
identifier: 'my-project_0.1.0-mock__542516121',
targetProject: {
name: 'my-project',
commitHash: '[not-a-git-repo]',
version: '0.1.0-mock',
},
configuration: {
targetProjectPaths: null,
optionA: false,
optionB: '',
debugEnabled: false,
gatherFilesConfig: {},
},
},
});
});
});
// TODO: think of exposing the ast traversal part in a distinct method "traverse", so we can
// create integrations with (a local version of) https://astexplorer.net
});
// describe.skip('Match Analyzers', () => {
// const referenceProject = {
// path: '/exporting/ref/project',
// name: 'exporting-ref-project',
// files: [
// {
// file: './package.json',
// code: `{
// "name": "importing-target-project",
// "version": "2.20.3",
// "dependencies": {
// "exporting-ref-project": "^2.3.0"
// }
// }`,
// },
// ],
// };
// const matchingTargetProject = {
// path: '/importing/target/project/v10',
// files: [
// {
// file: './package.json',
// code: `{
// "name": "importing-target-project",
// "version": "10.1.2",
// "dependencies": {
// "exporting-ref-project": "^2.3.0"
// }
// }`,
// },
// ],
// };
// const matchingDevDepTargetProject = {
// path: '/importing/target/project/v10',
// files: [
// {
// file: './package.json',
// code: `{
// "name": "importing-target-project",
// "version": "10.1.2",
// "devDependencies": {
// "exporting-ref-project": "^2.3.0"
// }
// }`,
// },
// ],
// };
// // A previous version that does not match our reference version
// const nonMatchingVersionTargetProject = {
// path: '/importing/target/project/v8',
// files: [
// {
// file: './package.json',
// code: `{
// "name": "importing-target-project",
// "version": "8.1.2",
// "dependencies": {
// "exporting-ref-project": "^1.9.0"
// }
// }`,
// },
// ],
// };
// const nonMatchingDepTargetProject = {
// path: '/importing/target/project/v8',
// files: [
// {
// file: './package.json',
// code: `{
// "name": "importing-target-project",
// "version": "8.1.2",
// "dependencies": {
// "some-other-project": "^0.1.0"
// }
// }`,
// },
// ],
// };
// it('has a "requiresReference" boolean', async () => {
// expect(dummyAnalyzer.requiresReference).to.equal(true);
// });
// describe('Prepare phase', () => {
// it('halts non-compatible reference + target combinations', async () => {
// mockTargetAndReferenceProject(referenceProject, nonMatchingVersionTargetProject);
// // Check stubbed LogService.info with reason 'no-matched-version'
// mockTargetAndReferenceProject(referenceProject, nonMatchingDepTargetProject);
// // Check stubbed LogService.info with reason 'no-dependency'
// });
// it('starts analysis for compatible reference + target combinations', async () => {
// mockTargetAndReferenceProject(referenceProject, matchingTargetProject);
// mockTargetAndReferenceProject(referenceProject, matchingDevDepTargetProject);
// // _prepare: startAnalysis: true
// });
// });
// });
});

View file

@ -0,0 +1,130 @@
const pathLib = require('path');
const { expect } = require('chai');
const { providence } = require('../../../../src/program/providence.js');
const { QueryService } = require('../../../../src/program/services/QueryService.js');
const { ReportService } = require('../../../../src/program/services/ReportService.js');
const { LogService } = require('../../../../src/program/services/LogService.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../../test-helpers/mock-log-service-helpers.js');
describe('Analyzers file-system integration', () => {
before(() => {
suppressNonCriticalLogs();
});
after(() => {
restoreSuppressNonCriticalLogs();
});
const generateE2eMode = process.argv.includes('--generate-e2e-mode');
const queryResults = [];
const targetPath = pathLib.resolve(
__dirname,
'../../../../test-helpers/project-mocks/importing-target-project',
);
const referencePath = pathLib.resolve(
__dirname,
`../../../../test-helpers/project-mocks/importing-target-project/node_modules/exporting-ref-project`,
);
const originalGetResultFileNameAndPath = ReportService._getResultFileNameAndPath;
const originalOutputPath = ReportService.outputPath;
after(() => {
ReportService._getResultFileNameAndPath = originalGetResultFileNameAndPath;
ReportService.outputPath = originalOutputPath;
});
if (generateE2eMode) {
ReportService.outputPath = pathLib.resolve(
__dirname,
'../../../../test-helpers/project-mocks-analyzer-outputs',
);
// eslint-disable-next-line func-names
ReportService._getResultFileNameAndPath = function (name) {
return pathLib.join(this.outputPath, `${name}.json`);
};
} else {
ReportService.outputPath = __dirname; // prevents the cache from failing the test
beforeEach(() => {
mockWriteToJson(queryResults);
});
afterEach(() => {
restoreWriteToJson(queryResults);
});
}
const analyzers = [
{
analyzerName: 'find-customelements',
providenceConfig: {
targetProjectPaths: [targetPath],
},
},
{
analyzerName: 'find-imports',
providenceConfig: {
targetProjectPaths: [targetPath],
},
},
{
analyzerName: 'find-exports',
providenceConfig: {
targetProjectPaths: [referencePath],
},
},
{
analyzerName: 'find-classes',
providenceConfig: {
targetProjectPaths: [targetPath],
},
},
{
analyzerName: 'match-imports',
providenceConfig: {
targetProjectPaths: [targetPath],
referenceProjectPaths: [referencePath],
},
},
{
analyzerName: 'match-subclasses',
providenceConfig: {
targetProjectPaths: [targetPath],
referenceProjectPaths: [referencePath],
},
},
{
analyzerName: 'match-paths',
providenceConfig: {
targetProjectPaths: [targetPath],
referenceProjectPaths: [referencePath],
},
},
];
for (const { analyzerName, providenceConfig } of analyzers) {
it(`"${analyzerName}" analyzer`, async () => {
const findExportsQueryConfig = QueryService.getQueryConfigFromAnalyzer(analyzerName);
await providence(findExportsQueryConfig, providenceConfig);
if (generateE2eMode) {
LogService.info(
'Successfully created mocks. Do not forget to rerun tests now without "--generate-e2e-mode"',
);
return;
}
// eslint-disable-next-line import/no-dynamic-require, global-require
const expectedOutput = require(`../../../../test-helpers/project-mocks-analyzer-outputs/${analyzerName}.json`);
const queryResult = JSON.parse(JSON.stringify(queryResults[0])).queryOutput;
expect(queryResult).to.eql(expectedOutput.queryOutput);
});
}
});

View file

@ -0,0 +1,247 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const {
mockProject,
restoreMockedProjects,
getEntry,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const findClassesQueryConfig = QueryService.getQueryConfigFromAnalyzer('find-classes');
describe('Analyzer "find-classes"', () => {
const queryResults = [];
const _providenceCfg = {
targetProjectPaths: ['/fictional/project'], // defined in mockProject
};
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
});
beforeEach(() => {
suppressNonCriticalLogs();
mockWriteToJson(queryResults);
});
afterEach(() => {
restoreSuppressNonCriticalLogs();
restoreWriteToJson(queryResults);
restoreMockedProjects();
});
it(`finds class definitions`, async () => {
mockProject([`class EmptyClass {}`]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result).to.eql([
{
name: 'EmptyClass',
isMixin: false,
members: {
methods: [],
props: [],
},
},
]);
});
it(`finds mixin definitions`, async () => {
mockProject([`const m = superclass => class MyMixin extends superclass {}`]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result).to.eql([
{
name: 'MyMixin',
superClasses: [
{
isMixin: false,
name: 'superclass',
rootFile: { file: '[current]', specifier: 'superclass' },
},
],
isMixin: true,
members: {
methods: [],
props: [],
},
},
]);
});
it(`stores superClasses`, async () => {
mockProject({
'./index.js': `
import { Mixin } from '@external/source';
class OtherClass {}
export class EmptyClass extends Mixin(OtherClass) {}
`,
'./internal.js': '',
});
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[1].superClasses).to.eql([
{
isMixin: true,
name: 'Mixin',
rootFile: { file: '@external/source', specifier: 'Mixin' },
},
{
isMixin: false,
name: 'OtherClass',
rootFile: { file: '[current]', specifier: 'OtherClass' },
},
]);
});
it(`handles multiple classes per file`, async () => {
mockProject([
` const m = superclass => class MyMixin extends superclass {}
class EmptyClass extends Mixin(OtherClass) {}`,
]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result.length).to.equal(2);
});
describe('Members', () => {
it(`stores methods`, async () => {
mockProject([
`class MyClass {
method() {}
_protectedMethod() {}
__privateMethod() {}
$protectedMethod() {}
$$privateMethod() {}
}`,
]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].members.methods).to.eql([
{
accessType: 'public',
name: 'method',
},
{
accessType: 'protected',
name: '_protectedMethod',
},
{
accessType: 'private',
name: '__privateMethod',
},
{
accessType: 'protected',
name: '$protectedMethod',
},
{
accessType: 'private',
name: '$$privateMethod',
},
]);
});
it(`stores props`, async () => {
mockProject([
`class MyClass {
get getterSetter() {}
set getterSetter(v) {}
static get _staticGetterSetter() {}
static set _staticGetterSetter(v) {}
}`,
]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].members.props).to.eql([
{
accessType: 'public',
kind: ['get', 'set'],
name: 'getterSetter',
},
{
accessType: 'protected',
kind: ['get', 'set'],
name: '_staticGetterSetter',
static: true,
},
]);
});
// Options below are disabled by default for now.
// TODO: provide as options
it.skip(`filters out platform members`, async () => {
mockProject([
`class MyClass {
static get attributes() {}
constructor() {}
connectedCallback() {}
disconnectedCallback() {}
}`,
]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].members.methods.length).to.equal(0);
expect(firstEntry.result[0].members.props.length).to.equal(0);
});
it.skip(`filters out LitElement members`, async () => {
mockProject([
`class MyClass {
static get properties() {}
static get styles() {}
get updateComplete() {}
_requestUpdate() {}
createRenderRoot() {}
render() {}
updated() {}
firstUpdated() {}
update() {}
shouldUpdate() {}
}`,
]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].members.methods.length).to.equal(0);
expect(firstEntry.result[0].members.props.length).to.equal(0);
});
it.skip(`filters out Lion members`, async () => {
mockProject([
`class MyClass {
static get localizeNamespaces() {}
get slots() {}
onLocaleUpdated() {}
}`,
]);
await providence(findClassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].members.methods.length).to.equal(0);
expect(firstEntry.result[0].members.props.length).to.equal(0);
});
});
});
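
The `accessType` values asserted in the member tests above follow a naming convention on the member name. A hypothetical helper (not the analyzer's actual code) that reproduces the same mapping:

```js
// Hypothetical sketch: derive accessType from a member name, matching the "stores methods"
// and "stores props" expectations above.
function accessTypeFromName(name) {
  if (name.startsWith('__') || name.startsWith('$$')) return 'private';
  if (name.startsWith('_') || name.startsWith('$')) return 'protected';
  return 'public';
}

['method', '_protectedMethod', '__privateMethod', '$protectedMethod', '$$privateMethod'].forEach(
  n => console.log(n, accessTypeFromName(n)),
);
// method -> public, _protectedMethod -> protected, __privateMethod -> private,
// $protectedMethod -> protected, $$privateMethod -> private
```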

View file

@ -0,0 +1,141 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const {
mockProject,
restoreMockedProjects,
getEntry,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const findCustomelementsQueryConfig = QueryService.getQueryConfigFromAnalyzer(
'find-customelements',
);
const _providenceCfg = {
targetProjectPaths: ['/fictional/project'], // defined in mockProject
};
describe('Analyzer "find-customelements"', () => {
const queryResults = [];
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
});
beforeEach(() => {
suppressNonCriticalLogs();
mockWriteToJson(queryResults);
});
afterEach(() => {
restoreSuppressNonCriticalLogs();
restoreMockedProjects();
restoreWriteToJson(queryResults);
});
it(`stores the tagName of a custom element`, async () => {
mockProject([`customElements.define('custom-el', class extends HTMLElement {});`]);
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].tagName).to.equal('custom-el');
});
it(`allows different notations for defining a custom element`, async () => {
mockProject([
`customElements.define('custom-el1', class extends HTMLElement {});`,
`window.customElements.define('custom-el2', class extends HTMLElement {});`,
`(() => {
window.customElements.define('custom-el3', class extends HTMLElement {});
})();`,
]);
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
const secondEntry = getEntry(queryResult, 1);
const thirdEntry = getEntry(queryResult, 2);
expect(firstEntry.result[0].tagName).to.equal('custom-el1');
expect(secondEntry.result[0].tagName).to.equal('custom-el2');
expect(thirdEntry.result[0].tagName).to.equal('custom-el3');
});
it(`stores the rootFile of a custom element`, async () => {
mockProject({
'./src/CustomEl.js': `export class CustomEl extends HTMLElement {}`,
'./custom-el.js': `
import { CustomEl } from './src/CustomEl.js';
customElements.define('custom-el', CustomEl);
`,
});
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].rootFile).to.eql({
file: './src/CustomEl.js',
specifier: 'CustomEl',
});
});
it(`stores "[inline]" constructors`, async () => {
mockProject([`customElements.define('custom-el', class extends HTMLElement {});`]);
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].constructorIdentifier).to.equal('[inline]');
expect(firstEntry.result[0].rootFile.specifier).to.equal('[inline]');
});
it(`stores "[current]" rootFile`, async () => {
mockProject([`customElements.define('custom-el', class extends HTMLElement {});`]);
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].rootFile.file).to.equal('[current]');
});
it(`stores the locally exported specifier in the rootFile `, async () => {
mockProject({
'./src/CustomEl.js': `export class CustomEl extends HTMLElement {}`,
'./custom-el.js': `
import { CustomEl } from './src/CustomEl.js';
customElements.define('custom-el', CustomEl);
`,
});
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].constructorIdentifier).to.equal('CustomEl');
expect(firstEntry.result[0].rootFile.specifier).to.equal('CustomEl');
});
it(`finds all occurrences of custom elements`, async () => {
mockProject([
`
customElements.define('tag-1', class extends HTMLElement {});
customElements.define('tag-2', class extends HTMLElement {});
`,
`
customElements.define('tag-3', class extends HTMLElement {});
`,
]);
await providence(findCustomelementsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
const secondEntry = getEntry(queryResult, 1);
expect(firstEntry.result.length).to.equal(2);
expect(secondEntry.result.length).to.equal(1);
});
});
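
Combining the assertions above, a single find-customelements result entry roughly has this shape. Values are illustrative; '[inline]' and '[current]' are the placeholders the tests assert for anonymous constructors and same-file root files:

```js
// Illustrative entry shape for the find-customelements analyzer (see tests above).
const exampleCustomElementEntry = {
  tagName: 'custom-el',
  constructorIdentifier: 'CustomEl', // '[inline]' when defined with an anonymous class
  rootFile: {
    file: './src/CustomEl.js', // '[current]' when the class lives in the defining file itself
    specifier: 'CustomEl',
  },
};
```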

View file

@ -0,0 +1,254 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const {
mockProject,
restoreMockedProjects,
getEntry,
getEntries,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const findExportsQueryConfig = QueryService.getQueryConfigFromAnalyzer('find-exports');
describe('Analyzer "find-exports"', () => {
const queryResults = [];
const _providenceCfg = {
targetProjectPaths: ['/fictional/project'], // defined in mockProject
};
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
});
beforeEach(() => {
suppressNonCriticalLogs();
mockWriteToJson(queryResults);
});
afterEach(() => {
restoreSuppressNonCriticalLogs();
restoreWriteToJson(queryResults);
restoreMockedProjects();
});
describe('Export notations', () => {
it(`supports [export const x = 0] (named specifier)`, async () => {
mockProject([`export const x = 0`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].exportSpecifiers[0]).to.equal('x');
expect(firstEntry.result[0].source).to.be.null;
});
it(`supports [export default class X {}] (default export)`, async () => {
mockProject([`export default class X {}`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].exportSpecifiers[0]).to.equal('[default]');
expect(firstEntry.result[0].source).to.equal(undefined);
});
it(`supports [export { x } from 'my/source'] (re-export named specifier)`, async () => {
mockProject([`export { x } from 'my/source'`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].exportSpecifiers[0]).to.equal('x');
expect(firstEntry.result[0].source).to.equal('my/source');
});
it(`supports [export { x as y } from 'my/source'] (re-export renamed specifier)`, async () => {
mockProject([`export { x as y } from 'my/source'`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].exportSpecifiers[0]).to.equal('y');
expect(firstEntry.result[0].source).to.equal('my/source');
});
it(`stores meta info (local name) of renamed specifiers`, async () => {
mockProject([`export { x as y } from 'my/source'`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
// This info will be relevant later to identify 'transitive' relations
expect(firstEntry.result[0].localMap).to.eql([
{
local: 'x',
exported: 'y',
},
]);
});
it(`supports [export { x, y } from 'my/source'] (multiple re-exported named specifiers)`, async () => {
mockProject([`export { x, y } from 'my/source'`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(2);
expect(firstEntry.result[0].exportSpecifiers).to.eql(['x', 'y']);
expect(firstEntry.result[0].source).to.equal('my/source');
});
it(`stores rootFileMap of an exported Identifier`, async () => {
mockProject({
'./src/OriginalComp.js': `export class OriginalComp {}`,
'./src/inbetween.js': `export { OriginalComp as InBetweenComp } from './OriginalComp.js'`,
'./index.js': `export { InBetweenComp as MyComp } from './src/inbetween.js'`,
});
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
const secondEntry = getEntry(queryResult, 1);
const thirdEntry = getEntry(queryResult, 2);
expect(firstEntry.result[0].rootFileMap).to.eql([
{
currentFileSpecifier: 'MyComp', // this is the local name in the file we track from
rootFile: {
file: './src/OriginalComp.js', // the file containing declaration
specifier: 'OriginalComp', // the specifier that was exported in file
},
},
]);
expect(secondEntry.result[0].rootFileMap).to.eql([
{
currentFileSpecifier: 'InBetweenComp',
rootFile: {
file: './src/OriginalComp.js',
specifier: 'OriginalComp',
},
},
]);
expect(thirdEntry.result[0].rootFileMap).to.eql([
{
currentFileSpecifier: 'OriginalComp',
rootFile: {
file: '[current]',
specifier: 'OriginalComp',
},
},
]);
});
// TODO: maybe in the future: This experimental syntax requires enabling the parser plugin: 'exportDefaultFrom'
it.skip(`stores rootFileMap of an exported Identifier`, async () => {
mockProject({
'./src/reexport.js': `
// a direct default import
import RefDefault from 'exporting-ref-project';
export RefDefault;
`,
'./index.js': `
export { ExtendRefDefault } from './src/reexport.js';
`,
});
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].rootFileMap).to.eql([
{
currentFileSpecifier: 'ExtendRefDefault',
rootFile: {
file: 'exporting-ref-project',
specifier: '[default]',
},
},
]);
});
});
describe('Export variable types', () => {
it(`classes`, async () => {
mockProject([`export class X {}`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].exportSpecifiers[0]).to.equal('X');
expect(firstEntry.result[0].source).to.be.null;
});
it(`functions`, async () => {
mockProject([`export function y() {}`]);
await providence(findExportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].exportSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].exportSpecifiers[0]).to.equal('y');
expect(firstEntry.result[0].source).to.be.null;
});
// ...etc?
// ...TODO: create custom hooks to store meta info about types etc.
});
describe('Default post processing', () => {
// onlyInternalSources: false,
// keepOriginalSourcePaths: false,
// filterSpecifier: null,
});
describe('Options', () => {
// TODO: Move to dashboard
it.skip(`"metaConfig.categoryConfig"`, async () => {
mockProject(
[
`export const foo = null`, // firstEntry
`export const bar = null`, // secondEntry
`export const baz = null`, // thirdEntry
],
{
projectName: 'my-project',
filePaths: ['./foo.js', './packages/bar/test/bar.test.js', './temp/baz.js'],
},
);
const findExportsCategoryQueryObj = QueryService.getQueryConfigFromAnalyzer('find-exports', {
metaConfig: {
categoryConfig: [
{
project: 'my-project',
categories: {
fooCategory: localFilePath => localFilePath.startsWith('./foo'),
barCategory: localFilePath => localFilePath.startsWith('./packages/bar'),
testCategory: localFilePath => localFilePath.includes('/test/'),
},
},
],
},
});
await providence(findExportsCategoryQueryObj, _providenceCfg);
const queryResult = queryResults[0];
const [firstEntry, secondEntry, thirdEntry] = getEntries(queryResult);
expect(firstEntry.meta.categories).to.eql(['fooCategory']);
// not mutually exclusive...
expect(secondEntry.meta.categories).to.eql(['barCategory', 'testCategory']);
expect(thirdEntry.meta.categories).to.eql([]);
});
});
});
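
To summarize the `rootFileMap` test above: the analyzer walks the re-export chain back to the declaring file, so the entry produced for `./index.js` points at the original class. The sketch below repeats the same data as the test, shown only for readability:

```js
// Chain from the test: ./index.js -> ./src/inbetween.js -> ./src/OriginalComp.js
const rootFileMapEntryForIndex = {
  currentFileSpecifier: 'MyComp', // the name exported by ./index.js
  rootFile: { file: './src/OriginalComp.js', specifier: 'OriginalComp' }, // where it is declared
};
```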

View file

@ -0,0 +1,347 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const {
mockProject,
restoreMockedProjects,
getEntry,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const findImportsQueryConfig = QueryService.getQueryConfigFromAnalyzer('find-imports');
const _providenceCfg = {
targetProjectPaths: ['/fictional/project'], // defined in mockProject
};
describe('Analyzer "find-imports"', () => {
const queryResults = [];
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
});
beforeEach(() => {
suppressNonCriticalLogs();
mockWriteToJson(queryResults);
});
afterEach(() => {
restoreSuppressNonCriticalLogs();
restoreMockedProjects();
restoreWriteToJson(queryResults);
});
describe('Import notations', () => {
it(`supports [import 'imported/source'] (no specifiers)`, async () => {
mockProject([`import 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers).to.eql(['[file]']);
expect(firstEntry.result[0].source).to.equal('imported/source');
});
it(`supports [import x from 'imported/source'] (default specifier)`, async () => {
mockProject([`import x from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('[default]');
expect(firstEntry.result[0].source).to.equal('imported/source');
});
it(`supports [import { x } from 'imported/source'] (named specifier)`, async () => {
mockProject([`import { x } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('x');
expect(firstEntry.result[0].importSpecifiers[1]).to.equal(undefined);
expect(firstEntry.result[0].source).to.equal('imported/source');
});
it(`supports [import { x, y } from 'imported/source'] (multiple named specifiers)`, async () => {
mockProject([`import { x, y } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('x');
expect(firstEntry.result[0].importSpecifiers[1]).to.equal('y');
expect(firstEntry.result[0].importSpecifiers[2]).to.equal(undefined);
expect(firstEntry.result[0].source).to.equal('imported/source');
});
it(`supports [import x, { y, z } from 'imported/source'] (default and named specifiers)`, async () => {
mockProject([`import x, { y, z } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('[default]');
expect(firstEntry.result[0].importSpecifiers[1]).to.equal('y');
expect(firstEntry.result[0].importSpecifiers[2]).to.equal('z');
expect(firstEntry.result[0].source).to.equal('imported/source');
});
it(`supports [import { x as y } from 'imported/source'] (renamed specifiers)`, async () => {
mockProject([`import { x as y } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('x');
});
it(`supports [import * as all from 'imported/source'] (namespace specifiers)`, async () => {
mockProject([`import * as all from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('[*]');
});
describe('Reexports', () => {
it(`supports [export { x } from 'imported/source'] (reexported named specifiers)`, async () => {
mockProject([`export { x } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('x');
});
it(`supports [export { x as y } from 'imported/source'] (reexported renamed specifiers)`, async () => {
mockProject([`export { x as y } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('x');
});
// maybe in the future... needs experimental babel flag "exportDefaultFrom"
it.skip(`supports [export x from 'imported/source'] (reexported default specifiers)`, async () => {
mockProject([`export x from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('x');
});
it(`supports [export * as x from 'imported/source'] (reexported namespace specifiers)`, async () => {
mockProject([`export * as x from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('[*]');
});
});
// Currently only supported for find-exports. For now not needed...
it.skip(`stores meta info (local name) of renamed specifiers`, async () => {
mockProject([`import { x as y } from 'imported/source'`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
// This info will be relevant later to identify transitive relations
expect(firstEntry.result[0].localMap[0]).to.eql({
local: 'y',
imported: 'x',
});
});
it(`supports [import('my/source')] (dynamic imports)`, async () => {
mockProject([`import('my/source')`]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('[default]');
// TODO: somehow mark as dynamic??
expect(firstEntry.result[0].source).to.equal('my/source');
});
it(`supports [import(pathReference)] (dynamic imports with variable source)`, async () => {
mockProject([
`
const pathReference = 'my/source';
import(pathReference);
`,
]);
await providence(findImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers[0]).to.equal('[default]');
// TODO: somehow mark as dynamic??
expect(firstEntry.result[0].source).to.equal('[variable]');
});
describe('Filter out false positives', () => {
it(`doesn't support [object.import('my/source')] (import method members)`, async () => {
mockProject([`object.import('my/source')`]);
await providence(findImportsQueryConfig, {
targetProjectPaths: ['/fictional/project'], // defined in mockProject
});
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry).to.equal(undefined);
});
});
/**
* Not in scope:
* - dynamic imports containing variables
* - tracking of specifier usage for default (dynamic or not) imports
*/
});
describe('Default post processing', () => {
it('only stores external sources', async () => {
mockProject([
`
import '@external/source';
import 'external/source';
import './internal/source';
import '../internal/source';
import '../../internal/source';
`,
]);
await providence(findImportsQueryConfig, { ..._providenceCfg });
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].source).to.equal('@external/source');
expect(firstEntry.result[1].source).to.equal('external/source');
expect(firstEntry.result[2]).to.equal(undefined);
});
it('normalizes source paths', async () => {
const queryConfig = QueryService.getQueryConfigFromAnalyzer('find-imports', {
keepInternalSources: true,
});
mockProject({
'./internal/file-imports.js': `
import '@external/source';
import 'external/source';
import './source/x'; // auto resolve filename
import '../'; // auto resolve root
`,
'./internal/source/x.js': '',
'./index.js': '',
});
await providence(queryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].normalizedSource).to.equal('@external/source');
// expect(firstEntry.result[0].fullSource).to.equal('@external/source');
expect(firstEntry.result[1].normalizedSource).to.equal('external/source');
// expect(firstEntry.result[1].fullSource).to.equal('external/source');
expect(firstEntry.result[2].normalizedSource).to.equal('./source/x.js');
// expect(firstEntry.result[2].fullSource).to.equal('./internal/source/x.js');
expect(firstEntry.result[3].normalizedSource).to.equal('../index.js');
// expect(firstEntry.result[3].fullSource).to.equal('./index.js');
expect(firstEntry.result[4]).to.equal(undefined);
});
});
describe('Options', () => {
it('"keepInternalSources"', async () => {
const queryConfig = QueryService.getQueryConfigFromAnalyzer('find-imports', {
keepInternalSources: true,
});
mockProject([
`
import '@external/source';
import 'external/source';
import './internal/source';
import '../internal/source';
import '../../internal/source';
`,
]);
await providence(queryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
expect(firstEntry.result[0].importSpecifiers.length).to.equal(1);
expect(firstEntry.result[0].source).to.equal('@external/source');
expect(firstEntry.result[1].source).to.equal('external/source');
expect(firstEntry.result[2].source).to.equal('./internal/source');
expect(firstEntry.result[3].source).to.equal('../internal/source');
expect(firstEntry.result[4].source).to.equal('../../internal/source');
expect(firstEntry.result[5]).to.equal(undefined);
});
// Post processors for whole result
it('"keepOriginalSourceExtensions"', async () => {
const queryConfig = QueryService.getQueryConfigFromAnalyzer('find-imports', {
keepOriginalSourceExtensions: true,
});
mockProject([`import '@external/source.js'`, `import '@external/source';`]);
await providence(queryConfig, _providenceCfg);
const queryResult = queryResults[0];
const firstEntry = getEntry(queryResult);
const secondEntry = getEntry(queryResult, 1);
expect(firstEntry.result[0].normalizedSource).to.equal('@external/source.js');
expect(secondEntry.result[0].normalizedSource).to.equal('@external/source');
});
// TODO: currently disabled. Might become default later (increased readability of json reports)
// but only without loss of information and once depending analyzers (match-imports and
// match-subclasses) are made compatible.
it.skip('"sortBySpecifier"', async () => {
const queryConfig = QueryService.getQueryConfigFromAnalyzer('find-imports', {
sortBySpecifier: true,
});
mockProject(
[
`import { x, y } from '@external/source.js'`,
`import { x, y, z } from '@external/source.js'`,
],
{ filePaths: ['./file1.js', './file2.js'] },
);
await providence(queryConfig, _providenceCfg);
const queryResult = queryResults[0];
/**
* Output will be in the format of:
*
* "queryOutput": [
* {
* "specifier": "LitElement",
* "source": "lion-based-ui/core",
* "id": "LitElement::lion-based-ui/core",
* "dependents": [
* "my-app-using-lion-based-ui/src/x.js",
* "my-app-using-lion-based-ui/src/y/z.js", *
* ...
*/
expect(queryResult.queryOutput[0].specifier).to.equal('x');
// Should be normalized source...?
expect(queryResult.queryOutput[0].source).to.equal('@external/source.js');
expect(queryResult.queryOutput[0].id).to.equal('x::@external/source.js');
expect(queryResult.queryOutput[0].dependents).to.eql([
'fictional-project/file1.js',
'fictional-project/file2.js',
]);
});
});
// TODO: put this in the generic providence/analyzer part
describe.skip('With <script type="module"> inside .html', () => {
it('gets the source from script tags', async () => {});
it('gets the content from script tags', async () => {});
});
});
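
A hypothetical helper (not the analyzer's implementation) capturing the internal/external split the 'Default post processing' tests above rely on: bare and scoped specifiers are kept, relative paths are dropped unless `keepInternalSources` is passed:

```js
// Hypothetical sketch only: classify an import source the way the tests above expect.
function isExternalSource(source) {
  return !source.startsWith('./') && !source.startsWith('../');
}

console.log(isExternalSource('@external/source')); // true  -> kept by default
console.log(isExternalSource('external/source')); // true  -> kept by default
console.log(isExternalSource('./internal/source')); // false -> dropped unless keepInternalSources
console.log(isExternalSource('../internal/source')); // false -> dropped unless keepInternalSources
// Normalization of internal paths ('./source/x' -> './source/x.js', '../' -> '../index.js')
// additionally needs file-system resolution, which the mocked projects provide in the tests.
```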

View file

@ -0,0 +1,261 @@
const { expect } = require('chai');
const { default: traverse } = require('@babel/traverse');
const {
trackDownIdentifier,
trackDownIdentifierFromScope,
} = require('../../../../src/program/analyzers/helpers/track-down-identifier.js');
const { AstService } = require('../../../../src/program/services/AstService.js');
const {
mockProject,
restoreMockedProjects,
} = require('../../../../test-helpers/mock-project-helpers.js');
describe('trackDownIdentifier', () => {
afterEach(() => {
restoreMockedProjects();
});
it(`tracks down identifier to root file (file that declares identifier)`, async () => {
mockProject(
{
'./src/declarationOfMyClass.js': `
export class MyClass extends HTMLElement {}
`,
'./currentFile.js': `
import { MyClass } from './src/declarationOfMyClass';
`,
},
{
projectName: 'my-project',
projectPath: '/my/project',
},
);
// Let's say we want to track down 'MyClass' in the code above
const source = './src/declarationOfMyClass';
const identifierName = 'MyClass';
const currentFilePath = '/my/project/currentFile.js';
const rootPath = '/my/project';
const rootFile = await trackDownIdentifier(source, identifierName, currentFilePath, rootPath);
expect(rootFile).to.eql({
file: './src/declarationOfMyClass.js',
specifier: 'MyClass',
});
});
it(`tracks down transitive and renamed identifiers`, async () => {
mockProject(
{
'./src/declarationOfMyClass.js': `
export class MyClass extends HTMLElement {}
`,
'./src/renamed.js': `
export { MyClass as MyRenamedClass } from './declarationOfMyClass.js';
`,
'./currentFile.js': `
import { MyRenamedClass } from './src/renamed';
`,
},
{
projectName: 'my-project',
projectPath: '/my/project',
},
);
// Let's say we want to track down 'MyClass' in the code above
const source = './src/renamed';
const identifierName = 'MyRenamedClass';
const currentFilePath = '/my/project/currentFile.js';
const rootPath = '/my/project';
const rootFile = await trackDownIdentifier(source, identifierName, currentFilePath, rootPath);
expect(rootFile).to.eql({
file: './src/declarationOfMyClass.js',
specifier: 'MyClass',
});
});
it(`tracks down default identifiers`, async () => {
mockProject(
{
'./src/declarationOfMyClass.js': `
export default class MyClass extends HTMLElement {}
`,
'./src/renamed.js': `
import MyClassDefaultReexport from './declarationOfMyClass.js';
export default MyClassDefaultReexport;
`,
'./currentFile.js': `
import MyClassDefaultImport from './src/renamed';
`,
},
{
projectName: 'my-project',
projectPath: '/my/project',
},
);
// Let's say we want to track down 'MyClass' in the code above
const source = './src/renamed';
const identifierName = '[default]';
const currentFilePath = '/my/project/currentFile.js';
const rootPath = '/my/project';
const rootFile = await trackDownIdentifier(source, identifierName, currentFilePath, rootPath);
expect(rootFile).to.eql({
file: './src/declarationOfMyClass.js',
specifier: '[default]',
});
});
it(`does not track down external sources`, async () => {
mockProject(
{
'./currentFile.js': `
import MyClassDefaultImport from '@external/source';
`,
},
{
projectName: 'my-project',
projectPath: '/my/project',
},
);
// Let's say we want to track down the default import from '@external/source' in the code above
const source = '@external/source';
const identifierName = '[default]';
const currentFilePath = '/my/project/currentFile.js';
const rootPath = '/my/project';
const rootFile = await trackDownIdentifier(source, identifierName, currentFilePath, rootPath);
expect(rootFile).to.eql({
file: '@external/source',
specifier: '[default]',
});
});
// TODO: improve perf
describe.skip('Caching', () => {});
});
describe('trackDownIdentifierFromScope', () => {
it(`gives back [current] if currentFilePath contains declaration`, async () => {
const projectFiles = {
'./src/declarationOfMyClass.js': `
export class MyClass extends HTMLElement {}
`,
};
mockProject(projectFiles, { projectName: 'my-project', projectPath: '/my/project' });
const ast = AstService._getBabelAst(projectFiles['./src/declarationOfMyClass.js']);
// Let's say we want to track down 'MyClass' in the code above
const identifierNameInScope = 'MyClass';
const fullCurrentFilePath = '/my/project/src/declarationOfMyClass.js';
const projectPath = '/my/project';
let astPath;
traverse(ast, {
ClassDeclaration(path) {
astPath = path;
},
});
const rootFile = await trackDownIdentifierFromScope(
astPath,
identifierNameInScope,
fullCurrentFilePath,
projectPath,
);
expect(rootFile).to.eql({
file: '[current]',
specifier: 'MyClass',
});
});
it(`tracks down re-exported identifiers`, async () => {
const projectFiles = {
'./src/declarationOfMyClass.js': `
export class MyClass extends HTMLElement {}
`,
'./re-export.js': `
// Other than with import, no binding is created for MyClass by Babel(?)
// This means 'path.scope.getBinding('MyClass')' returns undefined
// and we have to find a different way to retrieve this value
export { MyClass } from './src/declarationOfMyClass.js';
`,
'./imported.js': `
import { MyClass } from './re-export.js';
`,
};
mockProject(projectFiles, { projectName: 'my-project', projectPath: '/my/project' });
const ast = AstService._getBabelAst(projectFiles['./imported.js']);
// Let's say we want to track down 'MyClass' in the code above
const identifierNameInScope = 'MyClass';
const fullCurrentFilePath = '/my/project/internal.js';
const projectPath = '/my/project';
let astPath;
traverse(ast, {
ImportDeclaration(path) {
astPath = path;
},
});
const rootFile = await trackDownIdentifierFromScope(
astPath,
identifierNameInScope,
fullCurrentFilePath,
projectPath,
);
expect(rootFile).to.eql({
file: './src/declarationOfMyClass.js',
specifier: 'MyClass',
});
});
it(`tracks down extended classes from a reexport`, async () => {
const projectFiles = {
'./src/classes.js': `
export class El1 extends HTMLElement {}
export class El2 extends HTMLElement {}
`,
'./imported.js': `
export { El1, El2 } from './src/classes.js';
export class ExtendedEl1 extends El1 {}
`,
};
mockProject(projectFiles, { projectName: 'my-project', projectPath: '/my/project' });
const ast = AstService._getBabelAst(projectFiles['./imported.js']);
// Let's say we want to track down 'El1' in the code above
const identifierNameInScope = 'El1';
const fullCurrentFilePath = '/my/project/internal.js';
const projectPath = '/my/project';
let astPath;
traverse(ast, {
ClassDeclaration(path) {
astPath = path;
},
});
const rootFile = await trackDownIdentifierFromScope(
astPath,
identifierNameInScope,
fullCurrentFilePath,
projectPath,
);
expect(rootFile).to.eql({
file: './src/classes.js',
specifier: 'El1',
});
});
});

View file

@ -0,0 +1,308 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const { InputDataService } = require('../../../src/program/services/InputDataService.js');
const {
mockTargetAndReferenceProject,
restoreMockedProjects,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const matchImportsQueryConfig = QueryService.getQueryConfigFromAnalyzer('match-imports');
const _providenceCfg = {
targetProjectPaths: ['/importing/target/project'],
referenceProjectPaths: ['/exporting/ref/project'],
};
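// targetProjectPaths: the projects whose imports are analyzed ("importing-target-project" below);
// referenceProjectPaths: the projects whose exports those imports are matched against
// ("exporting-ref-project" below)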
// 1. Reference input data
const referenceProject = {
path: '/exporting/ref/project',
name: 'exporting-ref-project',
files: [
// This file contains all 'original' exported definitions
{
file: './ref-src/core.js',
code: `
// named specifier
export class RefClass extends HTMLElement {};
// default specifier
export default class OtherClass {};
`,
},
// This file is used to test file system resolution -> importing repos using
// `import 'exporting-ref-project/ref-src/folder'` should resolve to this index.js file
{
file: './ref-src/folder/index.js',
code: `
// this file (and thus this export) should be resolved via
// [import 'exporting-ref-project/ref-src/folder']
export const resolvePathCorrect = null;
`,
},
{
file: './ref-component.js',
code: `
// global effects
import { RefClass } from './ref-src/core.js';
customElements.define('ref-component', RefClass);
`,
},
{
file: './not-imported.js',
code: `
// this file will not be included by "importing-target-project" defined below
export const notImported = null;
`,
},
// This file re-exports everything from 'ref-src/core.js'
{
file: './index.js',
// Default export, renamed export
// export default class X
code: `
// re-exported specifier
export { RefClass } from './ref-src/core.js';
// renamed re-exported specifier
export { RefClass as RefRenamedClass } from './ref-src/core.js';
// re-exported default specifier
import refConstImported from './ref-src/core.js';
export default refConstImported;
`,
},
{
file: './export-namespaced.js',
code: `
// This file tests whether all its exported specifiers are caught via "import * as"
// (namespaced)
export const a = 4;
export default class B {};
`,
},
],
};
const searchTargetProject = {
path: '/importing/target/project',
name: 'importing-target-project',
files: [
{
file: './target-src/indirect-imports.js',
code: `
// named import (indirect, needs transitivity check)
import { RefClass } from 'exporting-ref-project';
// renamed import (indirect, needs transitivity check)
import { RefRenamedClass } from 'exporting-ref-project';
// default (indirect, needs transitivity check)
import refConstImported from 'exporting-ref-project';
// should not be found
import { nonMatched } from 'unknown-project';
`,
},
{
file: './target-src/direct-imports.js',
code: `
// a direct named import
import { RefClass } from 'exporting-ref-project/ref-src/core.js';
// a direct default import
import refConst from 'exporting-ref-project/ref-src/core.js';
// should not be found
import { nonMatched } from 'unknown-project/xyz.js';
/**
* Examples below should be resolved to the proper filepath (filename + extension)
* (direct or indirect is not relevant in this case, it is about the source and not the
* specifier)
*/
// Two things about the import below:
// - it is a file with side effects only
// - its source should resolve "as file", to 'exporting-ref-project/ref-component.js'
import 'exporting-ref-project/ref-component';
// The import below should resolve "as folder", to 'exporting-ref-project/ref-src/folder/index.js'
import { resolvePathCorrect } from 'exporting-ref-project/ref-src/folder';
`,
},
{
file: './import-namespaced.js',
code: `
// should return a match for every export in reference source
import * as namespace from 'exporting-ref-project/export-namespaced.js';
`,
},
/**
* Possible other checks (although already tested in unit tests of find-import/find-exports):
* - dynamic imports
* - default and named specifiers in one declaration
* - renamed imports
* - ...?
*/
],
};
// 2. Based on the example reference and target projects, we expect the following
// extracted specifiers to be found...
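// Note: export ids follow the pattern '<specifierName>::<relativeFilePath>::<projectName>',
// as can be seen in the entries below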
const expectedExportIdsIndirect = [
'RefClass::./index.js::exporting-ref-project',
'RefRenamedClass::./index.js::exporting-ref-project',
'[default]::./index.js::exporting-ref-project',
];
const expectedExportIdsDirect = [
'RefClass::./ref-src/core.js::exporting-ref-project',
'[default]::./ref-src/core.js::exporting-ref-project',
'resolvePathCorrect::./ref-src/folder/index.js::exporting-ref-project',
];
const expectedExportIdsNamespaced = [
'a::./export-namespaced.js::exporting-ref-project',
'[default]::./export-namespaced.js::exporting-ref-project',
];
// eslint-disable-next-line no-unused-vars
const expectedExportIds = [
...expectedExportIdsIndirect,
...expectedExportIdsDirect,
...expectedExportIdsNamespaced,
];
// 3. The AnalyzerResult generated by "match-imports"
// eslint-disable-next-line no-unused-vars
const expectedMatchesOutput = [
{
exportSpecifier: {
name: 'RefClass',
project: 'exporting-ref-project', // name under which it is registered in npm ("name" attr in package.json)
filePath: './ref-src/core.js',
id: 'RefClass::./ref-src/core.js::exporting-ref-project',
},
// All the matched targets (files importing the specifier), ordered per project
matchesPerProject: [
{
project: 'importing-target-project',
files: [
'./target-src/indirect-imports.js',
// ...
],
},
// ...
],
},
];
describe('Analyzer "match-imports"', () => {
const originalReferenceProjectPaths = InputDataService.referenceProjectPaths;
const queryResults = [];
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
suppressNonCriticalLogs();
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
restoreSuppressNonCriticalLogs();
});
beforeEach(() => {
mockWriteToJson(queryResults);
InputDataService.referenceProjectPaths = [];
});
afterEach(() => {
InputDataService.referenceProjectPaths = originalReferenceProjectPaths;
restoreWriteToJson(queryResults);
restoreMockedProjects();
});
describe('Extracting exports', () => {
it(`identifies all direct export specifiers consumed by "importing-target-project"`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsDirect.forEach(directId => {
expect(
queryResult.queryOutput.find(
exportMatchResult => exportMatchResult.exportSpecifier.id === directId,
),
).not.to.equal(undefined, `id '${directId}' not found`);
});
});
it(`identifies all indirect export specifiers consumed by "importing-target-project"`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsIndirect.forEach(indirectId => {
expect(
queryResult.queryOutput.find(
exportMatchResult => exportMatchResult.exportSpecifier.id === indirectId,
),
).not.to.equal(undefined, `id '${indirectId}' not found`);
});
});
it(`matches namespaced specifiers consumed by "importing-target-project"`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsNamespaced.forEach(exportedSpecifierId => {
expect(
queryResult.queryOutput.find(
exportMatchResult => exportMatchResult.exportSpecifier.id === exportedSpecifierId,
),
).not.to.equal(undefined, `id '${exportedSpecifierId}' not found`);
});
});
});
describe('Matching', () => {
it(`produces a list of all matches, sorted by project`, async () => {
function testMatchedEntry(targetExportedId, queryResult, importedByFiles = []) {
const matchedEntry = queryResult.queryOutput.find(
r => r.exportSpecifier.id === targetExportedId,
);
const [name, filePath, project] = targetExportedId.split('::');
expect(matchedEntry.exportSpecifier).to.eql({
name,
filePath,
project,
id: targetExportedId,
});
expect(matchedEntry.matchesPerProject[0].project).to.equal('importing-target-project');
expect(matchedEntry.matchesPerProject[0].files).to.eql(importedByFiles);
}
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchImportsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsDirect.forEach(targetId => {
testMatchedEntry(targetId, queryResult, ['./target-src/direct-imports.js']);
});
expectedExportIdsIndirect.forEach(targetId => {
testMatchedEntry(targetId, queryResult, ['./target-src/indirect-imports.js']);
});
});
});
});

View file

@ -0,0 +1,691 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const { InputDataService } = require('../../../src/program/services/InputDataService.js');
const {
mockTargetAndReferenceProject,
restoreMockedProjects,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const matchPathsQueryConfig = QueryService.getQueryConfigFromAnalyzer('match-paths');
const _providenceCfg = {
targetProjectPaths: ['/importing/target/project'],
referenceProjectPaths: ['/exporting/ref/project'],
};
describe('Analyzer "match-paths"', () => {
const originalReferenceProjectPaths = InputDataService.referenceProjectPaths;
const queryResults = [];
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
suppressNonCriticalLogs();
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
restoreSuppressNonCriticalLogs();
});
beforeEach(() => {
InputDataService.referenceProjectPaths = [];
mockWriteToJson(queryResults);
});
afterEach(() => {
InputDataService.referenceProjectPaths = originalReferenceProjectPaths;
restoreWriteToJson(queryResults);
restoreMockedProjects();
});
const referenceProject = {
path: '/exporting/ref/project',
name: 'exporting-ref-project',
files: [
{
file: './ref-src/core.js',
code: `
// named specifier
export class RefClass extends HTMLElement {};
// default specifier
export default class OtherClass {};
`,
},
{
file: './reexport.js',
code: `
export { RefClass as RefRenamedClass } from './ref-src/core.js';
// re-exported default specifier
import refConstImported from './ref-src/core.js';
export default refConstImported;
export const Mixin = superclass => class MyMixin extends superclass {}
`,
},
{
file: './importRefClass.js',
code: `
import { RefClass } from './ref-src/core.js';
`,
},
],
};
const searchTargetProject = {
path: '/importing/target/project',
name: 'importing-target-project',
files: [
{
file: './target-src/ExtendRefRenamedClass.js',
code: `
// renamed import (indirect, needs transitivity check)
import { RefRenamedClass } from 'exporting-ref-project/reexport.js';
import defaultExport from 'exporting-ref-project/reexport.js';
/**
* This should result in:
* {
* from: "RefRenamedClass", // should this point to same RefClass? For now, it doesn't
* to: "ExtendRefRenamedClass",
* paths: []
* }
* In other words, it won't end up in the Analyzer output, because RefRenamedClass
* is never imported internally inside the reference project.
*/
export class ExtendRefRenamedClass extends RefRenamedClass {}
`,
},
{
file: './target-src/direct-imports.js',
code: `
// a direct named import
import { RefClass } from 'exporting-ref-project/ref-src/core.js';
// a direct default import
import RefDefault from 'exporting-ref-project/reexport.js';
/**
* This should result in:
* {
* from: "[default]",
* to: "ExtendRefClass",
* paths: [{ from: "./index.js", to: "./target-src/direct-imports.js" }]
* }
*/
export class ExtendRefClass extends RefClass {}
/**
* For result, see './index.js'
*/
export class ExtendRefDefault extends RefDefault {}
`,
},
{
file: './index.js',
code: `
/**
* This should result in:
* {
* from: "[default]",
* to: "ExtendRefDefault",
* paths: [{ from: "./index.js", to: "./index.js" }]
* }
*/
export { ExtendRefDefault } from './target-src/direct-imports.js';
`,
},
],
};
describe('Variables', () => {
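// Each expected entry pairs a reference class ("from") with the target class that extends it ("to");
// its "paths" list the corresponding files, both project-relative and prefixed with the
// reference project name (see the "prefixes project paths" test below)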
const expectedMatches = [
{
name: 'RefRenamedClass',
variable: {
from: 'RefRenamedClass',
to: 'ExtendRefRenamedClass',
paths: [
{
from: './reexport.js',
to: './target-src/ExtendRefRenamedClass.js',
},
{
from: 'exporting-ref-project/reexport.js',
to: './target-src/ExtendRefRenamedClass.js',
},
],
},
},
{
name: '[default]',
variable: {
from: '[default]',
to: 'ExtendRefDefault',
paths: [
{
from: './reexport.js',
to: './index.js',
},
{
from: './ref-src/core.js',
to: './index.js',
},
{
from: 'exporting-ref-project/reexport.js',
to: './index.js',
},
{
from: 'exporting-ref-project/ref-src/core.js',
to: './index.js',
},
],
},
},
{
name: 'RefClass',
variable: {
from: 'RefClass',
to: 'ExtendRefClass',
paths: [
{
from: './ref-src/core.js',
to: './target-src/direct-imports.js',
},
{
from: 'exporting-ref-project/ref-src/core.js',
to: './target-src/direct-imports.js',
},
],
},
},
];
it(`outputs an array result with from/to classes and paths`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput).to.eql(expectedMatches);
});
describe('Features', () => {
const refProj = {
path: '/exporting/ref/project',
name: 'reference-project',
files: [
{
file: './index.js',
code: `
export class RefClass extends HTMLElement {}
`,
},
{
file: './src/importInternally.js',
code: `
import { RefClass } from '../index.js';
`,
},
],
};
const targetProj = {
path: '/importing/target/project',
name: 'importing-target-project',
files: [
{
file: './target-src/TargetClass.js',
// Indirect (via project root) imports
code: `
import { RefClass } from 'reference-project';
export class TargetClass extends RefClass {}
`,
},
],
};
it(`identifies all "from" and "to" classes`, async () => {
mockTargetAndReferenceProject(targetProj, refProj);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].variable.from).to.equal('RefClass');
expect(queryResult.queryOutput[0].variable.to).to.equal('TargetClass');
});
it(`identifies all "from" and "to" paths`, async () => {
mockTargetAndReferenceProject(targetProj, refProj);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].variable.paths[0]).to.eql({
from: './index.js',
to: './target-src/TargetClass.js',
});
});
describe('"to" path of target project', () => {
const targetProjWithMultipleExports = {
...targetProj,
files: [
...targetProj.files,
{
file: './reexportFromRoot.js',
code: `
export { TargetClass } from './target-src/TargetClass.js';
`,
},
],
};
it(`gives back "to" path closest to root`, async () => {
mockTargetAndReferenceProject(targetProjWithMultipleExports, refProj);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].variable.paths[0]).to.eql({
from: './index.js',
to: './reexportFromRoot.js',
});
});
it(`gives back "to" path that matches mainEntry if found`, async () => {
const targetProjWithMultipleExportsAndMainEntry = {
...targetProjWithMultipleExports,
files: [
...targetProjWithMultipleExports.files,
{
file: './target-src/mainEntry.js',
code: `
export { TargetClass } from './TargetClass.js';
`,
},
{
file: './package.json',
code: `{
"name": "${targetProjWithMultipleExports.name}",
"main": "./target-src/mainEntry.js",
"dependencies": {
"reference-project": "1.0.0"
}
}
`,
},
],
};
mockTargetAndReferenceProject(targetProjWithMultipleExportsAndMainEntry, refProj);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].variable.paths[0]).to.eql({
from: './index.js',
to: './target-src/mainEntry.js',
});
});
});
it(`prefixes project paths`, async () => {
mockTargetAndReferenceProject(targetProj, refProj);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
const unprefixedPaths = queryResult.queryOutput[0].variable.paths[0];
expect(unprefixedPaths).to.eql({ from: './index.js', to: './target-src/TargetClass.js' });
expect(queryResult.queryOutput[0].variable.paths[1]).to.eql({
from: `${refProj.name}/${unprefixedPaths.from.slice(2)}`,
to: unprefixedPaths.to,
});
});
it(`allows duplicate reference extensions (like "WolfRadio extends LionRadio" and
"WolfChip extends LionRadio")`, async () => {
const targetProjMultipleTargetExtensions = {
...targetProj,
files: [
{
file: './target-src/TargetSomething.js',
code: `
import { RefClass } from 'reference-project';
// in this case, we have TargetClass and TargetSomething => TargetClass wins,
// because its name most closely resembles RefClass
export class TargetSomething extends RefClass {}
`,
},
...targetProj.files,
],
};
mockTargetAndReferenceProject(targetProjMultipleTargetExtensions, refProj);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].variable.paths[0]).to.eql({
from: './index.js',
to: './target-src/TargetClass.js',
});
expect(queryResult.queryOutput[1].variable.paths[0]).to.eql({
from: './index.js',
to: './target-src/TargetSomething.js',
});
});
});
describe('Options', () => {
const refProj = {
path: '/exporting/ref/project',
name: 'reference-project',
files: [
{
file: './index.js',
code: `
export class RefClass extends HTMLElement {}
`,
},
{
file: './src/importInternally.js',
code: `
import { RefClass } from '../index.js';
`,
},
],
};
const targetProj = {
path: '/importing/target/project',
name: 'importing-target-project',
files: [
{
file: './target-src/TargetClass.js',
// Indirect (via project root) imports
code: `
import { RefClass } from 'reference-project';
export class TargetClass extends RefClass {}
`,
},
],
};
it(`filters out duplicates based on prefixes (so "WolfRadio extends LionRadio"
is kept, "WolfChip extends LionRadio" is removed)`, async () => {
const targetProjMultipleTargetExtensions = {
...targetProj,
files: [
{
file: './target-src/TargetSomething.js',
code: `
import { RefClass } from 'reference-project';
// in this case, we have TargetClass and TargetSomething => TargetClass wins,
// because its name most closely resembles RefClass
export class TargetSomething extends RefClass {}
`,
},
...targetProj.files,
],
};
mockTargetAndReferenceProject(targetProjMultipleTargetExtensions, refProj);
const matchPathsQueryConfigFilter = QueryService.getQueryConfigFromAnalyzer('match-paths', {
prefix: { from: 'ref', to: 'target' },
});
await providence(matchPathsQueryConfigFilter, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].variable.paths[0]).to.eql({
from: './index.js',
to: './target-src/TargetClass.js',
});
expect(queryResult.queryOutput[1]).to.equal(undefined);
});
});
});
describe('Tags', () => {
// eslint-disable-next-line no-shadow
const referenceProject = {
path: '/exporting/ref/project',
name: 'exporting-ref-project',
files: [
{
file: './customelementDefinitions.js',
code: `
// => these definitions need to be mapped to their extended counterparts in the target project
export { El1, El2 } from './classDefinitions.js';
customElements.define('el-1', El1);
customElements.define('el-2', El2);
`,
},
{
file: './classDefinitions.js',
code: `
export class El1 extends HTMLElement {};
export class El2 extends HTMLElement {};
`,
},
{
file: './import.js',
code: `
import './customelementDefinitions.js';
`,
},
],
};
// eslint-disable-next-line no-shadow
const searchTargetProject = {
path: '/importing/target/project',
name: 'importing-target-project',
files: [
{
file: './extendedCustomelementDefinitions.js',
code: `
import { ExtendedEl1 } from './extendedClassDefinitions.js';
import { ExtendedEl2 } from './reexportedExtendedClassDefinitions.js';
customElements.define('extended-el-1', ExtendedEl1);
customElements.define('extended-el-2', ExtendedEl2);
`,
},
{
file: './extendedClassDefinitions.js',
code: `
export { El1, El2 } from 'exporting-ref-project/classDefinitions.js';
export class ExtendedEl1 extends El1 {}
`,
},
{
file: './reexportedExtendedClassDefinitions.js',
code: `
export { El2 } from './extendedClassDefinitions.js';
export class ExtendedEl2 extends El2 {}
`,
},
],
};
// 2. The expected tag matches (from reference tag names to extended target tag names)
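// Each match maps a tag name defined via customElements.define in the reference project to the
// tag name of its extended counterpart in the target project; as with the "Variables" matches
// above, paths are listed both project-relative and prefixed with the reference project name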
const expectedMatches = [
{
from: 'el-1',
to: 'extended-el-1',
paths: [
{ from: './customelementDefinitions.js', to: './extendedCustomelementDefinitions.js' },
{
from: 'exporting-ref-project/customelementDefinitions.js',
to: './extendedCustomelementDefinitions.js',
},
],
},
{
from: 'el-2',
to: 'extended-el-2',
paths: [
{ from: './customelementDefinitions.js', to: './extendedCustomelementDefinitions.js' },
{
from: 'exporting-ref-project/customelementDefinitions.js',
to: './extendedCustomelementDefinitions.js',
},
],
},
];
it(`outputs an array result with from/to tag names and paths`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].tag).to.eql(expectedMatches[0]);
expect(queryResult.queryOutput[1].tag).to.eql(expectedMatches[1]);
});
describe('Features', () => {
it(`identifies all "from" and "to" tagnames`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].tag.from).to.equal('el-1');
expect(queryResult.queryOutput[0].tag.to).to.equal('extended-el-1');
});
it(`identifies all "from" and "to" paths`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].tag.paths[0]).to.eql({
from: './customelementDefinitions.js',
to: './extendedCustomelementDefinitions.js',
});
});
it(`prefixes project paths`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput[0].tag.paths[1]).to.eql({
from: 'exporting-ref-project/customelementDefinitions.js',
to: './extendedCustomelementDefinitions.js',
});
});
});
});
describe('Full structure', () => {
const referenceProjectFull = {
...referenceProject,
files: [
{
file: './tag.js',
code: `
import { RefClass } from './ref-src/core.js';
customElements.define('ref-class', RefClass);
`,
},
...referenceProject.files,
],
};
const searchTargetProjectFull = {
...searchTargetProject,
files: [
{
file: './tag-extended.js',
code: `
import { ExtendRefClass } from './target-src/direct-imports';
customElements.define('tag-extended', ExtendRefClass);
`,
},
...searchTargetProject.files,
],
};
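// Note: only the RefClass entry below carries both a "variable" and a "tag" match, since
// RefClass is both extended as a class (ExtendRefClass) and registered as a custom element
// ('ref-class', extended by 'tag-extended')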
const expectedMatchesFull = [
{
name: 'RefRenamedClass',
variable: {
from: 'RefRenamedClass',
to: 'ExtendRefRenamedClass',
paths: [
{
from: './reexport.js',
to: './target-src/ExtendRefRenamedClass.js',
},
{
from: 'exporting-ref-project/reexport.js',
to: './target-src/ExtendRefRenamedClass.js',
},
],
},
},
{
name: '[default]',
variable: {
from: '[default]',
to: 'ExtendRefDefault',
paths: [
{
from: './reexport.js',
to: './index.js',
},
{
from: './ref-src/core.js',
to: './index.js',
},
{
from: 'exporting-ref-project/reexport.js',
to: './index.js',
},
{
from: 'exporting-ref-project/ref-src/core.js',
to: './index.js',
},
],
},
},
{
name: 'RefClass',
variable: {
from: 'RefClass',
to: 'ExtendRefClass',
paths: [
{
from: './ref-src/core.js',
to: './target-src/direct-imports.js',
},
{
from: 'exporting-ref-project/ref-src/core.js',
to: './target-src/direct-imports.js',
},
],
},
tag: {
from: 'ref-class',
to: 'tag-extended',
paths: [
{
from: './tag.js',
to: './tag-extended.js',
},
{
from: 'exporting-ref-project/tag.js',
to: './tag-extended.js',
},
],
},
},
];
it(`outputs a "name", "variable" and "tag" entry`, async () => {
mockTargetAndReferenceProject(searchTargetProjectFull, referenceProjectFull);
await providence(matchPathsQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expect(queryResult.queryOutput).to.eql(expectedMatchesFull);
});
});
});

View file

@ -0,0 +1,235 @@
const { expect } = require('chai');
const { providence } = require('../../../src/program/providence.js');
const { QueryService } = require('../../../src/program/services/QueryService.js');
const { InputDataService } = require('../../../src/program/services/InputDataService.js');
const {
mockTargetAndReferenceProject,
restoreMockedProjects,
} = require('../../../test-helpers/mock-project-helpers.js');
const {
mockWriteToJson,
restoreWriteToJson,
} = require('../../../test-helpers/mock-report-service-helpers.js');
const {
suppressNonCriticalLogs,
restoreSuppressNonCriticalLogs,
} = require('../../../test-helpers/mock-log-service-helpers.js');
const matchSubclassesQueryConfig = QueryService.getQueryConfigFromAnalyzer('match-subclasses');
const _providenceCfg = {
targetProjectPaths: ['/importing/target/project'],
referenceProjectPaths: ['/exporting/ref/project'],
};
// 1. Reference input data
const referenceProject = {
path: '/exporting/ref/project',
name: 'exporting-ref-project',
files: [
// This file contains all 'original' exported definitions
{
file: './ref-src/core.js',
code: `
// named specifier
export class RefClass extends HTMLElement {};
// default specifier
export default class OtherClass {};
`,
},
// This file (the package root entry) re-exports a renamed and a default specifier
// from './ref-src/core.js' and additionally exposes a mixin
{
file: './index.js',
code: `
export { RefClass as RefRenamedClass } from './ref-src/core.js';
// re-exported default specifier
import refConstImported from './ref-src/core.js';
export default refConstImported;
export const Mixin = superclass => class MyMixin extends superclass {}
`,
},
],
};
const searchTargetProject = {
path: '/importing/target/project',
name: 'importing-target-project',
files: [
{
file: './target-src/indirect-imports.js',
// Indirect (via project root) imports
code: `
// renamed import (indirect, needs transitivity check)
import { RefRenamedClass } from 'exporting-ref-project';
import defaultExport from 'exporting-ref-project';
class ExtendRefRenamedClass extends RefRenamedClass {}
`,
},
{
file: './target-src/direct-imports.js',
code: `
// a direct named import
import { RefClass } from 'exporting-ref-project/ref-src/core.js';
// a direct default import
import RefDefault from 'exporting-ref-project';
// a direct named mixin
import { Mixin } from 'exporting-ref-project';
// Non-match
import { ForeignMixin } from 'unknow-project';
class ExtendRefClass extends RefClass {}
class ExtendRefDefault extends RefDefault {}
class ExtendRefClassWithMixin extends ForeignMixin(Mixin(RefClass)) {}
`,
},
],
};
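// Note: the mixin applied in direct-imports.js above (Mixin(RefClass)) is expected to be
// matched as well, see 'Mixin::./index.js::exporting-ref-project' in expectedExportIdsDirect below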
// 2. Extracted specifiers (by find-exports analyzer)
const expectedExportIdsIndirect = ['RefRenamedClass::./index.js::exporting-ref-project'];
const expectedExportIdsDirect = [
// ids should be unique across multiple projects
// Not in scope: version number of a project.
'RefClass::./ref-src/core.js::exporting-ref-project',
'[default]::./index.js::exporting-ref-project',
'Mixin::./index.js::exporting-ref-project',
];
// eslint-disable-next-line no-unused-vars
const expectedExportIds = [...expectedExportIdsIndirect, ...expectedExportIdsDirect];
// 3. The AnalyzerResult generated by "match-subclasses"
// eslint-disable-next-line no-unused-vars
const expectedMatchesOutput = [
{
exportSpecifier: {
name: 'RefClass',
// name under which it is registered in npm ("name" attr in package.json)
project: 'exporting-ref-project',
filePath: './ref-src/core.js',
id: 'RefClass::./ref-src/core.js::exporting-ref-project',
// TODO: next step => identify transitive relations and add them here,
// most likely via a post processor
},
// All the matched targets (files importing the specifier), ordered per project
matchesPerProject: [
{
project: 'importing-target-project',
files: [
{ file: './target-src/indirect-imports.js', identifier: 'ExtendedRefClass' },
// ...
],
},
// ...
],
},
];
// eslint-disable-next-line no-shadow
describe('Analyzer "match-subclasses"', () => {
const originalReferenceProjectPaths = InputDataService.referenceProjectPaths;
const queryResults = [];
const cacheDisabledInitialValue = QueryService.cacheDisabled;
before(() => {
QueryService.cacheDisabled = true;
suppressNonCriticalLogs();
});
after(() => {
QueryService.cacheDisabled = cacheDisabledInitialValue;
restoreSuppressNonCriticalLogs();
});
beforeEach(() => {
InputDataService.referenceProjectPaths = [];
mockWriteToJson(queryResults);
});
afterEach(() => {
InputDataService.referenceProjectPaths = originalReferenceProjectPaths;
restoreWriteToJson(queryResults);
restoreMockedProjects();
});
describe('Extracting exports', () => {
it(`identifies all indirect export specifiers consumed by "importing-target-project"`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchSubclassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsIndirect.forEach(indirectId => {
expect(
queryResult.queryOutput.find(
exportMatchResult => exportMatchResult.exportSpecifier.id === indirectId,
),
).not.to.equal(undefined, `id '${indirectId}' not found`);
});
});
it(`identifies all direct export specifiers consumed by "importing-target-project"`, async () => {
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchSubclassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsDirect.forEach(directId => {
expect(
queryResult.queryOutput.find(
exportMatchResult => exportMatchResult.exportSpecifier.id === directId,
),
).not.to.equal(undefined, `id '${directId}' not found`);
});
});
});
describe('Matching', () => {
// TODO: because we introduced an object in match-classes, we find duplicate entries in
// our result set created in match-subclasses. Fix there...
it.skip(`produces a list of all matches, sorted by project`, async () => {
function testMatchedEntry(targetExportedId, queryResult, importedByFiles = []) {
const matchedEntry = queryResult.queryOutput.find(
r => r.exportSpecifier.id === targetExportedId,
);
const [name, filePath, project] = targetExportedId.split('::');
expect(matchedEntry.exportSpecifier).to.eql({
name,
filePath,
project,
id: targetExportedId,
});
expect(matchedEntry.matchesPerProject[0].project).to.equal('importing-target-project');
expect(matchedEntry.matchesPerProject[0].files).to.eql(importedByFiles);
}
mockTargetAndReferenceProject(searchTargetProject, referenceProject);
await providence(matchSubclassesQueryConfig, _providenceCfg);
const queryResult = queryResults[0];
expectedExportIdsDirect.forEach(targetId => {
testMatchedEntry(targetId, queryResult, [
// TODO: 'identifier' needs to be the exported name of the extending class
{
identifier: targetId.split('::')[0],
file: './target-src/direct-imports.js',
memberOverrides: undefined,
},
]);
});
expectedExportIdsIndirect.forEach(targetId => {
testMatchedEntry(targetId, queryResult, [
// TODO: 'identifier' needs to be the exported name of the extending class
{ identifier: targetId.split('::')[0], file: './target-src/indirect-imports.js' },
]);
});
});
});
});

Some files were not shown because too many files have changed in this diff.