Merge remote-tracking branch 'code/master' into feature/merge-code

Fuyao Zhao 2018-12-06 14:34:55 -08:00
commit 9eda4c57bc
323 changed files with 32829 additions and 0 deletions

10
.gitmodules vendored Normal file

@ -0,0 +1,10 @@
[submodule "javascript-typescript-langserver"]
path = code/lsp/javascript-typescript-langserver
url = ../javascript-typescript-langserver
ignore = untracked
[submodule "kibana"]
path = code/kibana
url = ../kibana
[submodule "lsp/java-langserver"]
path = code/lsp/java-langserver
url = ../java-langserver

16
code/.gitignore vendored Normal file

@ -0,0 +1,16 @@
gen/
bin/
!/kibana-extra/code/packages/code-filename-check/bin/
.DS_Store
.gradle/
.idea/
.cache/
out/
.settings/
build
*.iws
*.iml
*.ipr
/data/
.project
.classpath


@ -0,0 +1,3 @@
{
"styleSheetToCompile": "public/styles.scss"
}

6
code/.vscode/settings.json vendored Normal file

@ -0,0 +1,6 @@
{
"git.ignoreLimitWarning": true,
"editor.tabSize": 2,
"editor.rulers": [110],
"files.trimTrailingWhitespace": true
}

0
code/CHANGES Normal file

41
code/CONTRIBUTING.md Normal file

@ -0,0 +1,41 @@
# Setting Up Your Development Environment
> You'll need a `java` binary in `PATH`, or `JAVA_HOME` set, to use the gradlew scripts.
> On Windows, use `./gradlew.bat` anywhere you see `./gradlew`.
Fork, then clone the `code` repo and move into it:
```bash
git clone https://github.com/[YOUR_USERNAME]/castro.git code
cd code
```
Bootstrap the castro repo and pull a local checkout of the Kibana repo by running the `bootstrap` Gradle task:
```bash
./gradlew bootstrap
```
Move into the Kibana checkout and start Elasticsearch from a nightly snapshot:
```bash
./gradlew startDeps
```
Start Kibana with the code plugin:
```bash
./gradlew startKibana
```
To develop the code intelligence features, you need to check out the submodules:
```bash
./scripts/update
```
Then run:
```bash
./gradlew lsp:javascript:build
```

223
code/LICENSE.txt Normal file

@ -0,0 +1,223 @@
ELASTIC LICENSE AGREEMENT
PLEASE READ CAREFULLY THIS ELASTIC LICENSE AGREEMENT (THIS "AGREEMENT"), WHICH
CONSTITUTES A LEGALLY BINDING AGREEMENT AND GOVERNS ALL OF YOUR USE OF ALL OF
THE ELASTIC SOFTWARE WITH WHICH THIS AGREEMENT IS INCLUDED ("ELASTIC SOFTWARE")
THAT IS PROVIDED IN OBJECT CODE FORMAT, AND, IN ACCORDANCE WITH SECTION 2 BELOW,
CERTAIN OF THE ELASTIC SOFTWARE THAT IS PROVIDED IN SOURCE CODE FORMAT. BY
INSTALLING OR USING ANY OF THE ELASTIC SOFTWARE GOVERNED BY THIS AGREEMENT, YOU
ARE ASSENTING TO THE TERMS AND CONDITIONS OF THIS AGREEMENT. IF YOU DO NOT AGREE
WITH SUCH TERMS AND CONDITIONS, YOU MAY NOT INSTALL OR USE THE ELASTIC SOFTWARE
GOVERNED BY THIS AGREEMENT. IF YOU ARE INSTALLING OR USING THE SOFTWARE ON
BEHALF OF A LEGAL ENTITY, YOU REPRESENT AND WARRANT THAT YOU HAVE THE ACTUAL
AUTHORITY TO AGREE TO THE TERMS AND CONDITIONS OF THIS AGREEMENT ON BEHALF OF
SUCH ENTITY.
Posted Date: April 20, 2018
This Agreement is entered into by and between Elasticsearch BV ("Elastic") and
You, or the legal entity on behalf of whom You are acting (as applicable,
"You").
1. OBJECT CODE END USER LICENSES, RESTRICTIONS AND THIRD PARTY OPEN SOURCE
SOFTWARE
1.1 Object Code End User License. Subject to the terms and conditions of
Section 1.2 of this Agreement, Elastic hereby grants to You, AT NO CHARGE and
for so long as you are not in breach of any provision of this Agreement, a
License to the Basic Features and Functions of the Elastic Software.
1.2 Reservation of Rights; Restrictions. As between Elastic and You, Elastic
and its licensors own all right, title and interest in and to the Elastic
Software, and except as expressly set forth in Sections 1.1, and 2.1 of this
Agreement, no other license to the Elastic Software is granted to You under
this Agreement, by implication, estoppel or otherwise. You agree not to: (i)
reverse engineer or decompile, decrypt, disassemble or otherwise reduce any
Elastic Software provided to You in Object Code, or any portion thereof, to
Source Code, except and only to the extent any such restriction is prohibited
by applicable law, (ii) except as expressly permitted in this Agreement,
prepare derivative works from, modify, copy or use the Elastic Software Object
Code or the Commercial Software Source Code in any manner; (iii) except as
expressly permitted in Section 1.1 above, transfer, sell, rent, lease,
distribute, sublicense, loan or otherwise transfer, Elastic Software Object
Code, in whole or in part, to any third party; (iv) use Elastic Software
Object Code for providing time-sharing services, any software-as-a-service,
service bureau services or as part of an application services provider or
other service offering (collectively, "SaaS Offering") where obtaining access
to the Elastic Software or the features and functions of the Elastic Software
is a primary reason or substantial motivation for users of the SaaS Offering
to access and/or use the SaaS Offering ("Prohibited SaaS Offering"); (v)
circumvent the limitations on use of Elastic Software provided to You in
Object Code format that are imposed or preserved by any License Key, or (vi)
alter or remove any Marks and Notices in the Elastic Software. If You have any
question as to whether a specific SaaS Offering constitutes a Prohibited SaaS
Offering, or are interested in obtaining Elastic's permission to engage in
commercial or non-commercial distribution of the Elastic Software, please
contact elastic_license@elastic.co.
1.3 Third Party Open Source Software. The Commercial Software may contain or
be provided with third party open source libraries, components, utilities and
other open source software (collectively, "Open Source Software"), which Open
Source Software may have applicable license terms as identified on a website
designated by Elastic. Notwithstanding anything to the contrary herein, use of
the Open Source Software shall be subject to the license terms and conditions
applicable to such Open Source Software, to the extent required by the
applicable licensor (which terms shall not restrict the license rights granted
to You hereunder, but may contain additional rights). To the extent any
condition of this Agreement conflicts with any license to the Open Source
Software, the Open Source Software license will govern with respect to such
Open Source Software only. Elastic may also separately provide you with
certain open source software that is licensed by Elastic. Your use of such
Elastic open source software will not be governed by this Agreement, but by
the applicable open source license terms.
2. COMMERCIAL SOFTWARE SOURCE CODE
2.1 Limited License. Subject to the terms and conditions of Section 2.2 of
this Agreement, Elastic hereby grants to You, AT NO CHARGE and for so long as
you are not in breach of any provision of this Agreement, a limited,
non-exclusive, non-transferable, fully paid up royalty free right and license
to the Commercial Software in Source Code format, without the right to grant
or authorize sublicenses, to prepare Derivative Works of the Commercial
Software, provided You (i) do not hack the licensing mechanism, or otherwise
circumvent the intended limitations on the use of Elastic Software to enable
features other than Basic Features and Functions or those features You are
entitled to as part of a Subscription, and (ii) use the resulting object code
only for reasonable testing purposes.
2.2 Restrictions. Nothing in Section 2.1 grants You the right to (i) use the
Commercial Software Source Code other than in accordance with Section 2.1
above, (ii) use a Derivative Work of the Commercial Software outside of a
Non-production Environment, in any production capacity, on a temporary or
permanent basis, or (iii) transfer, sell, rent, lease, distribute, sublicense,
loan or otherwise make available the Commercial Software Source Code, in whole
or in part, to any third party. Notwithstanding the foregoing, You may
maintain a copy of the repository in which the Source Code of the Commercial
Software resides and that copy may be publicly accessible, provided that you
include this Agreement with Your copy of the repository.
3. TERMINATION
3.1 Termination. This Agreement will automatically terminate, whether or not
You receive notice of such Termination from Elastic, if You breach any of its
provisions.
3.2 Post Termination. Upon any termination of this Agreement, for any reason,
You shall promptly cease the use of the Elastic Software in Object Code format
and cease use of the Commercial Software in Source Code format. For the
avoidance of doubt, termination of this Agreement will not affect Your right
to use Elastic Software, in either Object Code or Source Code formats, made
available under the Apache License Version 2.0.
3.3 Survival. Sections 1.2, 2.2. 3.3, 4 and 5 shall survive any termination or
expiration of this Agreement.
4. DISCLAIMER OF WARRANTIES AND LIMITATION OF LIABILITY
4.1 Disclaimer of Warranties. TO THE MAXIMUM EXTENT PERMITTED UNDER APPLICABLE
LAW, THE ELASTIC SOFTWARE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND,
AND ELASTIC AND ITS LICENSORS MAKE NO WARRANTIES WHETHER EXPRESSED, IMPLIED OR
STATUTORY REGARDING OR RELATING TO THE ELASTIC SOFTWARE. TO THE MAXIMUM EXTENT
PERMITTED UNDER APPLICABLE LAW, ELASTIC AND ITS LICENSORS SPECIFICALLY
DISCLAIM ALL IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
PURPOSE AND NON-INFRINGEMENT WITH RESPECT TO THE ELASTIC SOFTWARE, AND WITH
RESPECT TO THE USE OF THE FOREGOING. FURTHER, ELASTIC DOES NOT WARRANT RESULTS
OF USE OR THAT THE ELASTIC SOFTWARE WILL BE ERROR FREE OR THAT THE USE OF THE
ELASTIC SOFTWARE WILL BE UNINTERRUPTED.
4.2 Limitation of Liability. IN NO EVENT SHALL ELASTIC OR ITS LICENSORS BE
LIABLE TO YOU OR ANY THIRD PARTY FOR ANY DIRECT OR INDIRECT DAMAGES,
INCLUDING, WITHOUT LIMITATION, FOR ANY LOSS OF PROFITS, LOSS OF USE, BUSINESS
INTERRUPTION, LOSS OF DATA, COST OF SUBSTITUTE GOODS OR SERVICES, OR FOR ANY
SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES OF ANY KIND, IN CONNECTION WITH
OR ARISING OUT OF THE USE OR INABILITY TO USE THE ELASTIC SOFTWARE, OR THE
PERFORMANCE OF OR FAILURE TO PERFORM THIS AGREEMENT, WHETHER ALLEGED AS A
BREACH OF CONTRACT OR TORTIOUS CONDUCT, INCLUDING NEGLIGENCE, EVEN IF ELASTIC
HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
5. MISCELLANEOUS
This Agreement completely and exclusively states the entire agreement of the
parties regarding the subject matter herein, and it supersedes, and its terms
govern, all prior proposals, agreements, or other communications between the
parties, oral or written, regarding such subject matter. This Agreement may be
modified by Elastic from time to time, and any such modifications will be
effective upon the "Posted Date" set forth at the top of the modified
Agreement. If any provision hereof is held unenforceable, this Agreement will
continue without said provision and be interpreted to reflect the original
intent of the parties. This Agreement and any non-contractual obligation
arising out of or in connection with it, is governed exclusively by Dutch law.
This Agreement shall not be governed by the 1980 UN Convention on Contracts
for the International Sale of Goods. All disputes arising out of or in
connection with this Agreement, including its existence and validity, shall be
resolved by the courts with jurisdiction in Amsterdam, The Netherlands, except
where mandatory law provides for the courts at another location in The
Netherlands to have jurisdiction. The parties hereby irrevocably waive any and
all claims and defenses either might otherwise have in any such action or
proceeding in any of such courts based upon any alleged lack of personal
jurisdiction, improper venue, forum non conveniens or any similar claim or
defense. A breach or threatened breach, by You of Section 2 may cause
irreparable harm for which damages at law may not provide adequate relief, and
therefore Elastic shall be entitled to seek injunctive relief without being
required to post a bond. You may not assign this Agreement (including by
operation of law in connection with a merger or acquisition), in whole or in
part to any third party without the prior written consent of Elastic, which
may be withheld or granted by Elastic in its sole and absolute discretion.
Any assignment in violation of the preceding sentence is void. Notices to
Elastic may also be sent to legal@elastic.co.
6. DEFINITIONS
The following terms have the meanings ascribed:
6.1 "Affiliate" means, with respect to a party, any entity that controls, is
controlled by, or which is under common control with, such party, where
"control" means ownership of at least fifty percent (50%) of the outstanding
voting shares of the entity, or the contractual right to establish policy for,
and manage the operations of, the entity.
6.2 "Basic Features and Functions" means those features and functions of the
Elastic Software that are eligible for use under a Basic license, as set forth
at https://www.elastic.co/subscriptions, as may be modified by Elastic from
time to time.
6.3 "Commercial Software" means the Elastic Software Source Code in any file
containing a header stating the contents are subject to the Elastic License or
which is contained in the repository folder labeled "x-pack", unless a LICENSE
file present in the directory subtree declares a different license.
6.4 "Derivative Work of the Commercial Software" means, for purposes of this
Agreement, any modification(s) or enhancement(s) to the Commercial Software,
which represent, as a whole, an original work of authorship.
6.5 "License" means a limited, non-exclusive, non-transferable, fully paid up,
royalty free, right and license, without the right to grant or authorize
sublicenses, solely for Your internal business operations to (i) install and
use the applicable Features and Functions of the Elastic Software in Object
Code, and (ii) permit Contractors and Your Affiliates to use the Elastic
software as set forth in (i) above, provided that such use by Contractors must
be solely for Your benefit and/or the benefit of Your Affiliates, and You
shall be responsible for all acts and omissions of such Contractors and
Affiliates in connection with their use of the Elastic software that are
contrary to the terms and conditions of this Agreement.
6.6 "License Key" means a sequence of bytes, including but not limited to a
JSON blob, that is used to enable certain features and functions of the
Elastic Software.
6.7 "Marks and Notices" means all Elastic trademarks, trade names, logos and
notices present on the Documentation as originally provided by Elastic.
6.8 "Non-production Environment" means an environment for development, testing
or quality assurance, where software is not used for production purposes.
6.9 "Object Code" means any form resulting from mechanical transformation or
translation of Source Code form, including but not limited to compiled object
code, generated documentation, and conversions to other media types.
6.10 "Source Code" means the preferred form of computer software for making
modifications, including but not limited to software source code,
documentation source, and configuration files.
6.11 "Subscription" means the right to receive Support Services and a License
to the Commercial Software.

69
code/README.md Normal file

@ -0,0 +1,69 @@
# Codesearch
## Source
The source for the CodeSearch plugin can be found in `kibana-extra/codesearch`. This location lets it use a local checkout of the Kibana repository, which is created in `kibana` the first time you run `yarn kbn bootstrap`.
## Environment
Install Node version 8.11.4 and yarn version 1.10.1. If you want to quickly start the stack with the tmux script, you also need to install tmux and tmuxp:
```bash
brew install tmux
pip install tmuxp
```
## Development
See the [contributing guide](./CONTRIBUTING.md) for instructions on setting up your development environment. Once you have completed that, use the following scripts.
- `./scripts/update_submodule`
Initialize and clone the submodules, such as Kibana and the language servers.
All of the following commands need to be run under `kibana-extra/codesearch`:
- `yarn kbn bootstrap`
Install dependencies in Kibana and codesearch.
- `yarn start-deps`
Start an Elasticsearch instance using a nightly snapshot.
- `yarn start`
Start Kibana with the codesearch plugin included. Once started, you can visit the Kibana interface at http://localhost:5601.
- `yarn tslint`
Lint the source code with [`tslint`](https://github.com/palantir/tslint).
- `yarn tslint --fix`
Lint the source code with [`tslint`](https://github.com/palantir/tslint) and fix any auto-fixable errors.
- `yarn type-check`
Check types in the source code with the TypeScript compiler.
- `yarn type-check --watch`
Check types in the source code with the TypeScript compiler once initially and again whenever a source file changes.
You can bring up the stack and keep it running in the background without worrying about the processes being killed: after bootstrapping, just run
```bash
./scripts/tmux_session
```
Note that the language servers need to be built separately:
- TypeScript: `cd lsp/javascript-typescript-langserver; yarn run build`, or `yarn watch` for continuous builds
- Java: `cd lsp/java-langserver; ./mvnw package`
To start a production environment:
- `NODE_ENV=production node $NODE_OPTIONS --no-warnings src/cli --plugin-path ../kibana-extra/codesearch --config ../config/kibana/kibana.yml`
## License
All files in this repository are subject to the Elastic License. See [`LICENSE.txt`](./LICENSE.txt) for details.

43
code/build.gradle Normal file

@ -0,0 +1,43 @@
plugins {
id 'com.gradle.build-scan' version '1.15.1'
}
allprojects {
repositories {
mavenCentral()
}
apply {
plugin "idea"
}
}
buildScan {
licenseAgreementUrl = 'https://gradle.com/terms-of-service'
licenseAgree = 'yes'
if (System.getenv('CI')) {
publishAlways()
tag 'CI'
}
}
task checkoutSubmodule(type: Exec) {
commandLine "bash", "-c", "git submodule update --init --remote --rebase"
}
//task startDeps(type: Exec) {
// commandLine "docker-compose", "up", "-d"
//}
//
//task stopDeps(type: Exec) {
// commandLine "docker-compose", "down"
//}
idea {
module {
excludeDirs += [file("kibana")]
}
}
wrapper {
distributionUrl = "https://services.gradle.org/distributions/gradle-4.10-all.zip"
}


@ -0,0 +1,18 @@
metricbeat.modules:
- module: system
metricsets:
- cpu
- filesystem
- memory
- network
- process
enabled: true
period: 10s
processes: ['.*']
cpu_ticks: false
output.elasticsearch:
hosts: [ "docker.for.mac.localhost:9201" ]
setup.kibana:
host: "docker.for.mac.localhost:5601"


@ -0,0 +1,18 @@
optimize:
sourceMaps: false
# sourceMaps: '#cheap-source-map'
code:
redirectToNode: http://localhost:5601/{baseUrl}
enableGlobalReference: false
# repos:
# - repo: 'github.com/Microsoft/TypeScript-Node-Starter'
# init:
# - 'yarn'
# - 'install'
xpack.security.encryptionKey: "something_at_least_32_characters"
xpack.reporting.encryptionKey: "something_at_least_32_characters"
elasticsearch.username: "elastic"
elasticsearch.password: "changeme"


@ -0,0 +1,15 @@
optimize:
unsafeCache: true
sourceMaps: false
# sourceMaps: '#cheap-source-map'
code:
enableGlobalReference: false
# repos:
# - repo: 'github.com/Microsoft/TypeScript-Node-Starter'
# init:
# - 'yarn'
# - 'install'
xpack.security.encryptionKey: "something_at_least_32_characters"
xpack.reporting.encryptionKey: "something_at_least_32_characters"


@ -0,0 +1,2 @@
node:
name: "dev"


@ -0,0 +1,15 @@
input {
tcp {
port => 5000
codec => "json"
}
}
output {
elasticsearch {
hosts => [ "elasticsearch:9200" ]
}
stdout {
codec => json
}
}

4
code/gradle.properties Normal file

@ -0,0 +1,4 @@
kibanaVersion = 991e805669d69c9df25709e1ab4195c4ae08bb56
nodeVersion = 8.14.0
yarnVersion = 1.12.3

BIN
code/gradle/wrapper/gradle-wrapper.jar vendored Normal file

Binary file not shown.


@ -0,0 +1,5 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-4.10-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists

172
code/gradlew vendored Executable file

@ -0,0 +1,172 @@
#!/usr/bin/env sh
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Escape application args
save () {
for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
echo " "
}
APP_ARGS=$(save "$@")
# Collect all arguments for the java command, following the shell quoting and substitution rules
eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
# by default we should be in the correct project dir, but when run from Finder on Mac, the cwd is wrong
if [ "$(uname)" = "Darwin" ] && [ "$HOME" = "$PWD" ]; then
cd "$(dirname "$0")"
fi
exec "$JAVACMD" "$@"

84
code/gradlew.bat vendored Normal file

@ -0,0 +1,84 @@
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega

1
code/kibana Submodule

@ -0,0 +1 @@
Subproject commit a64afc05590e944a042cd12fe572b793e792b442

10
code/kibana-extra/code/.gitignore vendored Normal file

@ -0,0 +1,10 @@
npm-debug.log*
node_modules
/build/
*.js
!packages/code-esqueue/**/*.js
!packages/code-filename-check/**/*.js
!webpackShims/init-monaco.js
!server/lib/**/*.js
/public/styles.css
yarn-error.log


@ -0,0 +1,5 @@
{
"singleQuote": true,
"trailingComma": "es5",
"printWidth": 100
}


@ -0,0 +1,91 @@
buildscript {
repositories {
jcenter()
}
}
plugins {
id "com.moowork.node" version "1.2.0"
}
node {
version = rootProject.nodeVersion
yarnVersion = rootProject.yarnVersion
download = true
}
task bootstrap(type: YarnTask, dependsOn: [yarnSetup, rootProject.tasks.checkoutSubmodule]) {
inputs.property("kibanaVersion", kibanaVersion)
inputs.files(
file("$rootDir/kibana-extra/code/package.json"),
file("$rootDir/kibana-extra/code/yarn.lock"),
file("$rootDir/kibana/yarn.lock"),
file("$rootDir/kibana/x-pack/yarn.lock"),
fileTree("$rootDir/kibana/packages") {
include "*/yarn.lock"
}
)
outputs.files(
file("$rootDir/kibana-extra/code/node_modules/.yarn-integrity"),
file("$rootDir/kibana/node_modules/.yarn-integrity"),
file("$rootDir/kibana/x-pack/node_modules/.yarn-integrity"),
fileTree("$rootDir/kibana/packages") {
include "*/node_modules/.yarn-integrity"
}
)
args = ['kbn', 'bootstrap']
}
task startKibana(type: YarnTask, dependsOn: [bootstrap]) {
args = ['start', '--logging.json=false']
}
if (!Boolean.valueOf(System.getenv('LSP_DETACH'))) {
tasks.startKibana.dependsOn(":lsp:javascript-typescript-langserver:build")
}
task startFullKibana(type: YarnTask, dependsOn: [bootstrap, ":lsp:javascript-typescript-langserver:build", ":lsp:eclipse.jdt.ls:build"]) {
args = ['start', '--logging.json=false']
}
task debugKibana(type: YarnTask, dependsOn: [bootstrap, ":lsp:javascript-typescript-langserver:build"]) {
args = ['debug', '--logging.json=false']
}
task test(type: YarnTask) {
args = ['test']
}
task startDeps(type: YarnTask, dependsOn: [bootstrap]) {
args = ['start-deps']
}
task startCacheDeps(type: YarnTask, dependsOn: [bootstrap]) {
args = ['start-cache-deps']
}
task checkAllFilenames(type: YarnTask, dependsOn: [bootstrap]) {
args = ['check_all_filenames']
}
task lint(type: YarnTask, dependsOn: [bootstrap]) {
args = ['tslint']
}
task lintFix(type: YarnTask, dependsOn: [bootstrap]) {
args = ['tslint', '--fix']
}
task typeCheck(type: YarnTask, dependsOn: [bootstrap]) {
args = ['type-check']
}
task typeCheckWatch(type: YarnTask, dependsOn: [bootstrap]) {
args = ['type-check', '--watch']
}
tasks.withType(YarnTask) { task ->
task.environment = ['FORCE_COLOR': 'true']
}


@ -0,0 +1,19 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface GitBlame {
committer: {
name: string;
email: string;
};
startLine: number;
lines: number;
commit: {
id: string;
message: string;
date: string;
};
}


@ -0,0 +1,36 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
interface Commit {
sha: string;
author: string;
message: string;
date: Date;
}
export interface CommitDiff {
commit: Commit;
additions: number;
deletions: number;
files: FileDiff[];
}
export interface FileDiff {
path: string;
originPath?: string;
kind: DiffKind;
originCode?: string;
modifiedCode?: string;
language?: string;
additions: number;
deletions: number;
}
export enum DiffKind {
ADDED,
DELETED,
MODIFIED,
RENAMED,
}


@ -0,0 +1,34 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { isValidGitUrl } from './git_url_utils';
test('Git url validation', () => {
// A url ending with .git
expect(isValidGitUrl('https://github.com/elastic/elasticsearch.git')).toBeTruthy();
// A url without .git
expect(isValidGitUrl('https://github.com/elastic/elasticsearch')).toBeTruthy();
// A url with http://
expect(isValidGitUrl('http://github.com/elastic/elasticsearch')).toBeTruthy();
// A url with ssh://
expect(isValidGitUrl('ssh://elastic@github.com/elastic/elasticsearch.git')).toBeTruthy();
// A url with ssh:// and a port
expect(isValidGitUrl('ssh://elastic@github.com:9999/elastic/elasticsearch.git')).toBeTruthy();
// A url with git://
expect(isValidGitUrl('git://elastic@github.com/elastic/elasticsearch.git')).toBeTruthy();
// A url with an invalid protocol
expect(isValidGitUrl('file:///Users/elastic/elasticsearch')).toBeFalsy();
// A url without a protocol
expect(isValidGitUrl('/Users/elastic/elasticsearch')).toBeFalsy();
expect(isValidGitUrl('github.com/elastic/elasticsearch')).toBeFalsy();
});

@@ -0,0 +1,10 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export function isValidGitUrl(url: string): boolean {
const regex = /(?:git|ssh|https?|git@[-\w.]+):(\/\/)?(.*?)(\.git)?(\/?|\#[-\d\w._]+?)$/;
return regex.test(url);
}
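For reference, the pattern above can be exercised in isolation. The sketch below inlines the regex verbatim from `isValidGitUrl`; it accepts git/ssh/http(s) remotes and rejects bare filesystem paths, which have no protocol-like prefix.

```typescript
// The same pattern as isValidGitUrl above, inlined for standalone experimentation.
const gitUrlPattern = /(?:git|ssh|https?|git@[-\w.]+):(\/\/)?(.*?)(\.git)?(\/?|\#[-\d\w._]+?)$/;

const accepted = gitUrlPattern.test('ssh://elastic@github.com:9999/elastic/elasticsearch.git');
const rejected = gitUrlPattern.test('/Users/elastic/elasticsearch');

console.log({ accepted, rejected }); // { accepted: true, rejected: false }
```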

@@ -0,0 +1,25 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export enum InstallationType {
Embed,
Download,
}
export enum InstallEventType {
DOWNLOADING,
UNPACKING,
DONE,
FAIL,
}
export interface InstallEvent {
langServerName: string;
eventType: InstallEventType;
progress?: number;
message?: string;
params?: any;
}

@@ -0,0 +1,23 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { InstallationType } from './installation';
export enum LanguageServerStatus {
NOT_INSTALLED,
INSTALLING,
READY, // installed but not running
RUNNING,
}
export interface LanguageServer {
name: string;
languages: string[];
installationType: InstallationType;
version?: string;
build?: string;
status?: LanguageServerStatus;
}

@@ -0,0 +1,37 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import _ from 'lodash';
import { SourceLocation } from '../model';
export class LineMapper {
private lines: string[];
private acc: number[];
constructor(content: string) {
this.lines = content.split('\n');
this.acc = [0];
this.getLocation = this.getLocation.bind(this);
for (let i = 0; i < this.lines.length - 1; i++) {
this.acc[i + 1] = this.acc[i] + this.lines[i].length + 1;
}
}
public getLocation(offset: number): SourceLocation {
let line = _.sortedIndex(this.acc, offset);
if (offset !== this.acc[line]) {
line -= 1;
}
const column = offset - this.acc[line];
return { line, column, offset };
}
public getLines(): string[] {
return this.lines;
}
}
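The prefix-sum trick used by `LineMapper` can be illustrated without the lodash dependency. This is a minimal standalone sketch of the same offset-to-position mapping; the hand-rolled binary search stands in for `_.sortedIndex` but returns the same insertion index.

```typescript
// Minimal re-creation of LineMapper's offset -> { line, column } mapping.
// acc[i] holds the offset of the first character of line i; a binary search
// (standing in for lodash's _.sortedIndex) finds the containing line.
function locate(content: string, offset: number): { line: number; column: number } {
  const lines = content.split('\n');
  const acc = [0];
  for (let i = 0; i < lines.length - 1; i++) {
    acc[i + 1] = acc[i] + lines[i].length + 1; // +1 for the stripped '\n'
  }
  // Lowest index at which offset could be inserted while keeping acc sorted.
  let lo = 0;
  let hi = acc.length;
  while (lo < hi) {
    const mid = (lo + hi) >>> 1;
    if (acc[mid] < offset) lo = mid + 1;
    else hi = mid;
  }
  const line = offset === acc[lo] ? lo : lo - 1;
  return { line, column: offset - acc[line] };
}

// 'foo\nbar': offset 5 is the 'a' in 'bar' -> line 1, column 1.
console.log(locate('foo\nbar', 5)); // { line: 1, column: 1 }
```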

@@ -0,0 +1,44 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { ResponseError, ResponseMessage } from 'vscode-jsonrpc/lib/messages';
export { TextDocumentMethods } from './text_document_methods';
import { kfetch } from 'ui/kfetch';
export interface LspClient {
sendRequest(method: string, params: any, signal?: AbortSignal): Promise<ResponseMessage>;
}
export class LspRestClient implements LspClient {
private baseUri: string;
constructor(baseUri: string) {
this.baseUri = baseUri;
}
public async sendRequest(
method: string,
params: any,
signal?: AbortSignal
): Promise<ResponseMessage> {
try {
const response = await kfetch({
pathname: `${this.baseUri}/${method}`,
method: 'POST',
body: JSON.stringify(params),
signal,
});
return response as ResponseMessage;
} catch (e) {
let error = e;
if (error.body && error.body.error) {
error = error.body.error;
}
throw new ResponseError<any>(error.code, error.message, error.data);
}
}
}

@@ -0,0 +1,12 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { ErrorCodes } from 'vscode-jsonrpc/lib/messages';
export const ServerNotInitialized: number = ErrorCodes.ServerNotInitialized;
export const UnknownErrorCode: number = ErrorCodes.UnknownErrorCode;
export const UnknownFileLanguage: number = -42404;
export const LanguageServerNotInstalled: number = -42403;

@@ -0,0 +1,39 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { AsyncTask } from '../public/monaco/computer';
import { LspClient } from './lsp_client';
export class LspMethod<INPUT, OUTPUT> {
private client: LspClient;
private method: string;
constructor(method: string, client: LspClient) {
this.client = client;
this.method = method;
}
public asyncTask(input: INPUT): AsyncTask<OUTPUT> {
const abortController = new AbortController();
const promise = () => {
return this.client
.sendRequest(this.method, input, abortController.signal)
.then(result => result.result as OUTPUT);
};
return {
cancel() {
abortController.abort();
},
promise,
};
}
public async send(input: INPUT): Promise<OUTPUT> {
return await this.client
.sendRequest(this.method, input)
.then(result => result.result as OUTPUT);
}
}
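The cancel/promise pairing in `asyncTask` is a small, reusable pattern: one `AbortController` per task, aborted by `cancel()`. A standalone sketch (the names here are illustrative, not taken from the plugin) shows the shape without the LSP plumbing:

```typescript
// Illustrative stand-in for the AsyncTask shape used above.
interface CancellableTask<T> {
  promise: () => Promise<T>;
  cancel: () => void;
}

// Wrap any signal-aware async function the same way LspMethod.asyncTask does.
function makeTask<T>(run: (signal: AbortSignal) => Promise<T>): CancellableTask<T> {
  const controller = new AbortController();
  return {
    promise: () => run(controller.signal),
    cancel: () => controller.abort(),
  };
}

const task = makeTask(async signal => {
  if (signal.aborted) throw new Error('aborted before start');
  return 'done';
});
task.cancel();
task.promise().catch(err => console.log(err.message)); // 'aborted before start'
```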

@@ -0,0 +1,96 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { RepositoryUtils } from './repository_utils';
test('Repository url parsing', () => {
// Valid git url without .git suffix.
const repo1 = RepositoryUtils.buildRepository('https://github.com/apache/sqoop');
expect(repo1).toEqual({
uri: 'github.com/apache/sqoop',
url: 'https://github.com/apache/sqoop',
name: 'sqoop',
org: 'apache',
});
// Valid git url with .git suffix.
const repo2 = RepositoryUtils.buildRepository('https://github.com/apache/sqoop.git');
expect(repo2).toEqual({
uri: 'github.com/apache/sqoop',
url: 'https://github.com/apache/sqoop.git',
name: 'sqoop',
org: 'apache',
});
// An invalid git url
const repo3 = RepositoryUtils.buildRepository('github.com/apache/sqoop');
expect(repo3).toMatchObject({
uri: 'github.com/apache/sqoop',
url: 'http://github.com/apache/sqoop',
});
const repo4 = RepositoryUtils.buildRepository('git://a/b');
expect(repo4).toEqual({
uri: 'a/_/b',
url: 'git://a/b',
name: 'b',
org: '_',
});
const repo5 = RepositoryUtils.buildRepository('git://a/b/c');
expect(repo5).toEqual({
uri: 'a/b/c',
url: 'git://a/b/c',
name: 'c',
org: 'b',
});
});
test('Repository url parsing with non standard segments', () => {
const repo1 = RepositoryUtils.buildRepository('git://a/b/c/d');
expect(repo1).toEqual({
uri: 'a/b_c/d',
url: 'git://a/b/c/d',
name: 'd',
org: 'b_c',
});
const repo2 = RepositoryUtils.buildRepository('git://a/b/c/d/e');
expect(repo2).toEqual({
uri: 'a/b_c_d/e',
url: 'git://a/b/c/d/e',
name: 'e',
org: 'b_c_d',
});
const repo3 = RepositoryUtils.buildRepository('git://a');
expect(repo3).toEqual({
uri: 'a/_/_',
url: 'git://a',
name: '_',
org: '_',
});
});
test('Repository url parsing with port', () => {
const repo1 = RepositoryUtils.buildRepository('ssh://mine@mydomain.com:27017/gitolite-admin');
expect(repo1).toEqual({
uri: 'mydomain.com:27017/mine/gitolite-admin',
url: 'ssh://mine@mydomain.com:27017/gitolite-admin',
name: 'gitolite-admin',
org: 'mine',
});
const repo2 = RepositoryUtils.buildRepository(
'ssh://mine@mydomain.com:27017/elastic/gitolite-admin'
);
expect(repo2).toEqual({
uri: 'mydomain.com:27017/elastic/gitolite-admin',
url: 'ssh://mine@mydomain.com:27017/elastic/gitolite-admin',
name: 'gitolite-admin',
org: 'elastic',
});
});

@@ -0,0 +1,97 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import GitUrlParse from 'git-url-parse';
import path from 'path';
import { Location } from 'vscode-languageserver';
import { CloneProgress, FileTree, FileTreeItemType, Repository, RepositoryUri } from '../model';
import { parseLspUrl, toCanonicalUrl } from './uri_util';
export class RepositoryUtils {
// Generate a Repository instance by parsing the repository remote URL.
public static buildRepository(remoteUrl: string): Repository {
const repo = GitUrlParse(remoteUrl);
let host = repo.source ? repo.source : '';
if (repo.port !== null) {
host = host + ':' + repo.port;
}
const name = repo.name ? repo.name : '_';
const org = repo.owner ? repo.owner.split('/').join('_') : '_';
const uri: RepositoryUri = host ? `${host}/${org}/${name}` : repo.full_name;
return {
uri,
url: repo.href as string,
name,
org,
};
}
// From uri 'origin/org/name' to 'name'
public static repoNameFromUri(repoUri: RepositoryUri): string {
const segs = repoUri.split('/');
if (segs && segs.length === 3) {
return segs[2];
} else {
return 'invalid';
}
}
// From uri 'origin/org/name' to 'org/name'
public static repoFullNameFromUri(repoUri: RepositoryUri): string {
const segs = repoUri.split('/');
if (segs && segs.length === 3) {
return segs[1] + '/' + segs[2];
} else {
return 'invalid';
}
}
// Return the local data path of a given repository.
public static repositoryLocalPath(repoPath: string, repoUri: RepositoryUri) {
return path.join(repoPath, repoUri);
}
public static normalizeRepoUriToIndexName(repoUri: RepositoryUri) {
return repoUri
.split('/')
.join('-')
.toLowerCase();
}
public static locationToUrl(loc: Location) {
const url = parseLspUrl(loc.uri);
const { repoUri, file, revision } = url;
if (repoUri && file && revision) {
return toCanonicalUrl({ repoUri, file, revision, position: loc.range.start });
}
return '';
}
public static getAllFiles(fileTree: FileTree): string[] {
if (!fileTree) {
return [];
}
let result: string[] = [];
switch (fileTree.type) {
case FileTreeItemType.File:
result.push(fileTree.path!);
break;
case FileTreeItemType.Directory:
for (const node of fileTree.children!) {
result = result.concat(RepositoryUtils.getAllFiles(node));
}
break;
default:
break;
}
return result;
}
public static hasFullyCloned(cloneProgress?: CloneProgress | null): boolean {
return !!cloneProgress && cloneProgress.isCloned !== undefined && cloneProgress.isCloned;
}
}
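As a concrete illustration of the URI conventions above, the index-name normalization reduces to a one-liner; this sketch inlines the same transformation as `RepositoryUtils.normalizeRepoUriToIndexName` so the mapping is visible at a glance.

```typescript
// Path separators become dashes and the result is lowercased, yielding a
// string usable as an Elasticsearch index name.
const toIndexName = (repoUri: string) =>
  repoUri
    .split('/')
    .join('-')
    .toLowerCase();

console.log(toIndexName('github.com/Microsoft/TypeScript')); // 'github.com-microsoft-typescript'
```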

@@ -0,0 +1,36 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { SymbolLocator } from '@code/lsp-extension';
import { TextDocumentPositionParams } from 'vscode-languageserver';
import {
Definition,
DocumentSymbolParams,
Hover,
Location,
SymbolInformation,
} from 'vscode-languageserver-types';
import { LspClient } from './lsp_client';
import { LspMethod } from './lsp_method';
export class TextDocumentMethods {
public documentSymbol: LspMethod<DocumentSymbolParams, SymbolInformation[]>;
public hover: LspMethod<TextDocumentPositionParams, Hover>;
public definition: LspMethod<TextDocumentPositionParams, Definition>;
public edefinition: LspMethod<TextDocumentPositionParams, SymbolLocator[]>;
public references: LspMethod<TextDocumentPositionParams, Location[]>;
private readonly client: LspClient;
constructor(client: LspClient) {
this.client = client;
this.documentSymbol = new LspMethod('textDocument/documentSymbol', this.client);
this.hover = new LspMethod('textDocument/hover', this.client);
this.definition = new LspMethod('textDocument/definition', this.client);
this.edefinition = new LspMethod('textDocument/edefinition', this.client);
this.references = new LspMethod('textDocument/references', this.client);
}
}

@@ -0,0 +1,86 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { RepositoryUri } from '../model';
import { parseLspUrl, toCanonicalUrl, toRepoName, toRepoNameWithOrg } from './uri_util';
test('parse a complete uri', () => {
const fullUrl =
'git://github.com/Microsoft/vscode/blob/f2e49a2/src/vs/base/parts/ipc/test/node/ipc.net.test.ts';
const result = parseLspUrl(fullUrl);
expect(result).toEqual({
uri:
'/github.com/Microsoft/vscode/blob/f2e49a2/src/vs/base/parts/ipc/test/node/ipc.net.test.ts',
repoUri: 'github.com/Microsoft/vscode',
pathType: 'blob',
revision: 'f2e49a2',
file: 'src/vs/base/parts/ipc/test/node/ipc.net.test.ts',
schema: 'git:',
});
});
test('parseLspUrl a uri without schema', () => {
const url =
'github.com/Microsoft/vscode/blob/f2e49a2/src/vs/base/parts/ipc/test/node/ipc.net.test.ts';
const result = parseLspUrl(url);
expect(result).toEqual({
uri:
'/github.com/Microsoft/vscode/blob/f2e49a2/src/vs/base/parts/ipc/test/node/ipc.net.test.ts',
repoUri: 'github.com/Microsoft/vscode',
pathType: 'blob',
revision: 'f2e49a2',
file: 'src/vs/base/parts/ipc/test/node/ipc.net.test.ts',
});
});
test('parseLspUrl a tree uri', () => {
const uri = 'github.com/Microsoft/vscode/tree/head/src';
const result = parseLspUrl(uri);
expect(result).toEqual({
uri: '/github.com/Microsoft/vscode/tree/head/src',
repoUri: 'github.com/Microsoft/vscode',
pathType: 'tree',
revision: 'head',
file: 'src',
});
});
test('toCanonicalUrl round trip', () => {
const uri =
'git://github.com/Microsoft/vscode/blob/f2e49a2/src/vs/base/parts/ipc/test/node/ipc.net.test.ts';
const result = parseLspUrl(uri);
expect(result).toEqual({
uri:
'/github.com/Microsoft/vscode/blob/f2e49a2/src/vs/base/parts/ipc/test/node/ipc.net.test.ts',
repoUri: 'github.com/Microsoft/vscode',
pathType: 'blob',
revision: 'f2e49a2',
file: 'src/vs/base/parts/ipc/test/node/ipc.net.test.ts',
schema: 'git:',
});
const convertBack = toCanonicalUrl(result!);
expect(convertBack).toEqual(uri);
});
test('toRepoName', () => {
const uri: RepositoryUri = 'github.com/elastic/elasticsearch';
expect(toRepoName(uri)).toEqual('elasticsearch');
const invalidUri: RepositoryUri = 'github.com/elastic/elasticsearch/invalid';
expect(() => {
toRepoName(invalidUri);
}).toThrow();
});
test('toRepoNameWithOrg', () => {
const uri: RepositoryUri = 'github.com/elastic/elasticsearch';
expect(toRepoNameWithOrg(uri)).toEqual('elastic/elasticsearch');
const invalidUri: RepositoryUri = 'github.com/elastic/elasticsearch/invalid';
expect(() => {
toRepoNameWithOrg(invalidUri);
}).toThrow();
});

@@ -0,0 +1,131 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { Uri } from 'monaco-editor';
import pathToRegexp from 'path-to-regexp';
import { Position } from 'vscode-languageserver-types';
import { RepositoryUri } from '../model';
import { MAIN, MAIN_ROOT } from '../public/components/routes';
const mainRe = pathToRegexp(MAIN);
const mainRootRe = pathToRegexp(MAIN_ROOT);
export interface ParsedUrl {
schema?: string;
uri?: string;
}
export interface CompleteParsedUrl extends ParsedUrl {
repoUri: string;
revision: string;
pathType?: string;
file?: string;
schema?: string;
position?: Position;
}
export function parseSchema(url: string): { uri: string; schema?: string } {
let [schema, uri] = url.toString().split('//');
if (!uri) {
uri = schema;
// @ts-ignore
schema = undefined;
}
if (!uri.startsWith('/')) {
uri = '/' + uri;
}
return { uri, schema };
}
export function parseGoto(goto: string): Position | undefined {
const regex = /L(\d+)(:\d+)?$/;
const m = regex.exec(goto);
if (m) {
const line = parseInt(m[1], 10);
let character = 0;
if (m[2]) {
character = parseInt(m[2].substring(1), 10);
}
return {
line,
character,
};
}
}
export function parseLspUrl(url: Uri | string): CompleteParsedUrl {
const { schema, uri } = parseSchema(url.toString());
const mainParsed = mainRe.exec(uri);
const mainRootParsed = mainRootRe.exec(uri);
if (mainParsed) {
const [resource, org, repo, pathType, revision, file, goto] = mainParsed.slice(1);
let position;
if (goto) {
position = parseGoto(goto);
}
return {
uri: uri.replace(goto, ''),
repoUri: `${resource}/${org}/${repo}`,
pathType,
revision,
file,
schema,
position,
};
} else if (mainRootParsed) {
const [resource, org, repo, pathType, revision] = mainRootParsed.slice(1);
return {
uri,
repoUri: `${resource}/${org}/${repo}`,
pathType,
revision,
schema,
};
} else {
throw new Error('invalid url ' + url);
}
}
/*
* From RepositoryUri to repository name.
* e.g. github.com/elastic/elasticsearch -> elasticsearch
*/
export function toRepoName(uri: RepositoryUri): string {
const segs = uri.split('/');
if (segs.length !== 3) {
throw new Error(`Invalid repository uri ${uri}`);
}
return segs[2];
}
/*
* From RepositoryUri to repository name with organization prefix.
* e.g. github.com/elastic/elasticsearch -> elastic/elasticsearch
*/
export function toRepoNameWithOrg(uri: RepositoryUri): string {
const segs = uri.split('/');
if (segs.length !== 3) {
throw new Error(`Invalid repository uri ${uri}`);
}
return `${segs[1]}/${segs[2]}`;
}
const compiled = pathToRegexp.compile(MAIN);
export function toCanonicalUrl(lspUrl: CompleteParsedUrl) {
const [resource, org, repo] = lspUrl.repoUri!.split('/');
if (!lspUrl.pathType) {
lspUrl.pathType = 'blob';
}
let goto;
if (lspUrl.position) {
goto = `!L${lspUrl.position.line + 1}:${lspUrl.position.character}`;
}
const data = { resource, org, repo, path: lspUrl.file, goto, ...lspUrl };
const uri = decodeURIComponent(compiled(data));
return lspUrl.schema ? `${lspUrl.schema}/${uri}` : uri;
}
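The `!L<line>:<column>` fragment handled by `parseGoto` can be sketched standalone. The extraction below mirrors the function above: the column is optional and defaults to 0, and the parsed numbers are returned as-is.

```typescript
// Same extraction as parseGoto: a trailing 'L<line>' or 'L<line>:<column>'
// fragment (e.g. from '#!L42:7') becomes a { line, character } pair.
function extractGoto(goto: string): { line: number; character: number } | undefined {
  const m = /L(\d+)(:\d+)?$/.exec(goto);
  if (!m) return undefined;
  return {
    line: parseInt(m[1], 10),
    character: m[2] ? parseInt(m[2].substring(1), 10) : 0,
  };
}

console.log(extractGoto('!L42:7')); // { line: 42, character: 7 }
console.log(extractGoto('no-position')); // undefined
```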

@@ -0,0 +1,185 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { EsClient, Esqueue } from '@code/esqueue';
import moment from 'moment';
import { resolve } from 'path';
import {
LspIndexerFactory,
RepositoryIndexInitializerFactory,
tryMigrateIndices,
} from './server/indexer';
import { Server } from './server/kibana_types';
import { Log } from './server/log';
import { InstallManager } from './server/lsp/install_manager';
import { LspService } from './server/lsp/lsp_service';
import {
CancellationSerivce,
CloneWorker,
DeleteWorker,
IndexWorker,
UpdateWorker,
} from './server/queue';
import { fileRoute } from './server/routes/file';
import { installRoute } from './server/routes/install';
import { lspRoute, symbolByQnameRoute } from './server/routes/lsp';
import { redirectRoute } from './server/routes/redirect';
import { redirectSocketRoute } from './server/routes/redirect_socket';
import { repositoryRoute } from './server/routes/repository';
import {
documentSearchRoute,
repositorySearchRoute,
symbolSearchRoute,
} from './server/routes/search';
import { socketRoute } from './server/routes/socket';
import { userRoute } from './server/routes/user';
import { workspaceRoute } from './server/routes/workspace';
import { IndexScheduler, UpdateScheduler } from './server/scheduler';
import { DocumentSearchClient, RepositorySearchClient, SymbolSearchClient } from './server/search';
import { ServerOptions } from './server/server_options';
import { SocketService } from './server/socket_service';
import { ServerLoggerFactory } from './server/utils/server_logger_factory';
// tslint:disable-next-line no-default-export
export default (kibana: any) =>
new kibana.Plugin({
require: ['elasticsearch'],
name: 'code',
publicDir: resolve(__dirname, 'public'),
uiExports: {
app: {
title: 'Code',
description: 'Code Search Plugin',
main: 'plugins/code/app',
},
styleSheetPaths: resolve(__dirname, 'public/styles.scss'),
},
config(Joi: any) {
return Joi.object({
enabled: Joi.boolean().default(true),
queueIndex: Joi.string().default('.code-worker-queue'),
// 1 hour by default.
queueTimeout: Joi.number().default(moment.duration(1, 'hour').asMilliseconds()),
// The frequency which update scheduler executes. 5 minutes by default.
updateFrequencyMs: Joi.number().default(moment.duration(5, 'minute').asMilliseconds()),
// The frequency which index scheduler executes. 1 day by default.
indexFrequencyMs: Joi.number().default(moment.duration(1, 'day').asMilliseconds()),
// The frequency which each repo tries to update. 1 hour by default.
updateRepoFrequencyMs: Joi.number().default(moment.duration(1, 'hour').asMilliseconds()),
// The frequency which each repo tries to index. 1 day by default.
indexRepoFrequencyMs: Joi.number().default(moment.duration(1, 'day').asMilliseconds()),
// Timeout an LSP request after 10 seconds by default.
lspRequestTimeoutMs: Joi.number().default(moment.duration(10, 'second').asMilliseconds()),
repos: Joi.array().default([]),
maxWorkspace: Joi.number().default(5), // max workspace folder for each language server
isAdmin: Joi.boolean().default(true), // Whether to show the admin buttons
disableScheduler: Joi.boolean().default(true), // Temp option to disable all schedulers.
enableGlobalReference: Joi.boolean().default(false), // Global reference as optional feature for now
redirectToNode: Joi.string(),
}).default();
},
init: async (server: Server, options: any) => {
const queueIndex = server.config().get('code.queueIndex');
const queueTimeout = server.config().get('code.queueTimeout');
const adminCluster = server.plugins.elasticsearch.getCluster('admin');
const dataCluster = server.plugins.elasticsearch.getCluster('data');
const log = new Log(server);
const serverOptions = new ServerOptions(options, server.config());
if (serverOptions.redirectToNode) {
log.info(
`redirect node enabled, all requests will be redirected to ${serverOptions.redirectToNode}`
);
redirectRoute(server, options, log);
await redirectSocketRoute(server, options, log);
return;
}
const socketService = new SocketService(server, log);
// Initialize search clients
const repoSearchClient = new RepositorySearchClient(dataCluster.getClient(), log);
const documentSearchClient = new DocumentSearchClient(dataCluster.getClient(), log);
const symbolSearchClient = new SymbolSearchClient(dataCluster.getClient(), log);
const esClient: EsClient = adminCluster.getClient();
const installManager = new InstallManager(serverOptions);
const lspService = new LspService(
'127.0.0.1',
serverOptions,
esClient,
installManager,
new ServerLoggerFactory(server)
);
// Initialize indexing factories.
const lspIndexerFactory = new LspIndexerFactory(lspService, serverOptions, esClient, log);
const repoIndexInitializerFactory = new RepositoryIndexInitializerFactory(esClient, log);
// Initialize queue worker cancellation service.
const cancellationService = new CancellationSerivce();
// Execute index version checking and try to migrate index data if necessary.
await tryMigrateIndices(esClient, log);
// Initialize queue.
const queue = new Esqueue(queueIndex, {
client: esClient,
timeout: queueTimeout,
doctype: 'esqueue',
});
const indexWorker = new IndexWorker(
queue,
log,
esClient,
[lspIndexerFactory],
cancellationService,
socketService
).bind();
const cloneWorker = new CloneWorker(queue, log, esClient, indexWorker, socketService).bind();
const deleteWorker = new DeleteWorker(
queue,
log,
esClient,
cancellationService,
lspService,
socketService
).bind();
const updateWorker = new UpdateWorker(queue, log, esClient).bind();
// Initialize schedulers.
const updateScheduler = new UpdateScheduler(updateWorker, serverOptions, esClient, log);
const indexScheduler = new IndexScheduler(indexWorker, serverOptions, esClient, log);
if (!serverOptions.disableScheduler) {
updateScheduler.start();
indexScheduler.start();
}
// Add server routes and initialize the plugin here
repositoryRoute(
server,
serverOptions,
cloneWorker,
deleteWorker,
indexWorker,
repoIndexInitializerFactory
);
repositorySearchRoute(server, repoSearchClient);
documentSearchRoute(server, documentSearchClient);
symbolSearchRoute(server, symbolSearchClient);
fileRoute(server, serverOptions);
workspaceRoute(server, serverOptions, esClient);
symbolByQnameRoute(server, symbolSearchClient);
socketRoute(server, socketService, log);
userRoute(server, serverOptions);
installRoute(server, socketService, lspService, installManager, serverOptions);
lspRoute(server, lspService, serverOptions);
},
});

@@ -0,0 +1,26 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface CommitInfo {
updated: Date;
message: string;
committer: string;
id: string;
}
export interface ReferenceInfo {
name: string;
reference: string;
commit: CommitInfo;
type: ReferenceType;
}
export enum ReferenceType {
BRANCH,
TAG,
REMOTE_BRANCH,
OTHER,
}

@@ -0,0 +1,19 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface CodeLine extends Array<Token> {}
export interface Token {
value: string;
scopes: string[];
range?: Range;
}
export interface Range {
start: number; // start pos in line
end: number;
pos?: number; // position in file
}

@@ -0,0 +1,13 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export * from './highlight';
export * from './search';
export * from './repository';
export * from './task';
export * from './lsp';
export * from './workspace';
export * from './socket';

@@ -0,0 +1,16 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface LspRequest {
method: string;
params: any;
documentUri?: string; // Assume there is only one URI per request for now.
resolvedFilePath?: string;
workspacePath?: string;
workspaceRevision?: string;
isNotification?: boolean; // if this is a notification request that doesn't need response
timeoutForInitializeMs?: number; // If the language server is initializing, how many milliseconds to wait for it. Defaults to infinite.
}

@@ -0,0 +1,110 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export type RepositoryUri = string;
export interface Repository {
/** In the form of git://github.com/lambdalab/lambdalab */
uri: RepositoryUri;
/** Original clone URL */
url: string;
name?: string;
org?: string;
defaultBranch?: string;
revision?: string;
// The timestamp of next update for this repository.
nextUpdateTimestamp?: Date;
// The timestamp of next index for this repository.
nextIndexTimestamp?: Date;
}
export interface RepositoryConfig {
uri: RepositoryUri;
disableJava?: boolean;
disableTypescript?: boolean;
}
export interface FileTree {
name: string;
type: FileTreeItemType;
/** Full path of the node; does not need to be set by the server */
path?: string;
/**
* Children of the file tree. If undefined, this node is a file; if null, it is a
* directory whose children have not been evaluated yet.
*/
children?: FileTree[];
/**
* Count of child nodes of the current node; use this for pagination.
*/
childrenCount?: number;
sha1?: string;
/**
* URI of the current repository.
*/
repoUri?: string;
}
export enum FileTreeItemType {
File,
Directory,
Submodule,
}
export interface WorkerResult {
uri: string;
}
// TODO(mengwei): create an AbstractGitWorkerResult since we now have an
// AbstractGitWorker.
export interface CloneWorkerResult extends WorkerResult {
repo: Repository;
}
export interface DeleteWorkerResult extends WorkerResult {
res: boolean;
}
export interface UpdateWorkerResult extends WorkerResult {
branch: string;
revision: string;
}
export enum IndexStatsKey {
File = 'file-count',
Symbol = 'symbol-count',
Reference = 'reference-count',
}
export type IndexStats = Map<IndexStatsKey, number>;
export interface IndexWorkerResult extends WorkerResult {
revision: string;
stats: IndexStats;
}
export interface WorkerProgress {
// Job payload repository uri.
uri: string;
progress: number;
timestamp: Date;
revision?: string;
}
export interface CloneProgress {
isCloned?: boolean;
receivedObjects: number;
indexedObjects: number;
totalObjects: number;
localObjects: number;
totalDeltas: number;
indexedDeltas: number;
receivedBytes: number;
}
export interface CloneWorkerProgress extends WorkerProgress {
cloneProgress?: CloneProgress;
}
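The counters in `CloneProgress` mirror git's transfer statistics. As a hypothetical illustration (this helper is not part of the plugin, and the formula is an assumption), an overall percentage could be derived from the object counts:

```typescript
// Hypothetical helper: derive a rough clone percentage from the counters,
// guarding against a zero totalObjects before the first progress callback.
function clonePercent(p: { receivedObjects: number; totalObjects: number }): number {
  if (p.totalObjects === 0) return 0;
  return Math.min(100, Math.round((p.receivedObjects / p.totalObjects) * 100));
}

console.log(clonePercent({ receivedObjects: 50, totalObjects: 200 })); // 25
```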

@@ -0,0 +1,132 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { DetailSymbolInformation } from '@code/lsp-extension';
import { IRange } from 'monaco-editor';
import { Repository, SourceHit } from '../model';
import { RepositoryUri } from './repository';
export interface Document {
repoUri: RepositoryUri;
path: string;
content: string;
qnames: string[];
language?: string;
sha1?: string;
}
// The base interface of indexer requests
export interface IndexRequest {
repoUri: RepositoryUri;
}
// The request for LspIndexer
export interface LspIndexRequest extends IndexRequest {
localRepoPath: string; // The repository local file path
filePath: string; // The file path within the repository
revision: string; // The revision of the current repository
}
// The request for RepositoryIndexer
export interface RepositoryIndexRequest extends IndexRequest {
repoUri: RepositoryUri;
}
// The base interface of any kind of search requests.
export interface SearchRequest {
query: string;
page: number;
resultsPerPage?: number;
}
export interface RepositorySearchRequest extends SearchRequest {
query: string;
}
export interface DocumentSearchRequest extends SearchRequest {
query: string;
repoFilters?: string[];
langFilters?: string[];
}
export interface SymbolSearchRequest extends SearchRequest {
query: string;
}
// The base interface of any kind of search result.
export interface SearchResult {
total: number;
took: number;
}
export interface RepositorySearchResult extends SearchResult {
repositories: Repository[];
}
export interface SymbolSearchResult extends SearchResult {
// TODO: we might need an additional data structure for the symbol search result.
symbols: DetailSymbolInformation[];
}
// All the interfaces for search page
// The item of the search result stats. e.g. Typescript -> 123
export interface SearchResultStatsItem {
name: string;
value: number;
}
export interface SearchResultStats {
total: number; // Total number of results
from: number; // The beginning of the result range
to: number; // The end of the result range
page: number; // The page number
totalPage: number; // The total number of pages
repoStats: SearchResultStatsItem[];
languageStats: SearchResultStatsItem[];
}
export interface CompositeSourceContent {
content: string;
lineMapping: string[];
ranges: IRange[];
}
export interface SearchResultItem {
uri: string;
hits: number;
filePath: string;
language: string;
compositeContent: CompositeSourceContent;
}
export interface DocumentSearchResult extends SearchResult {
query: string;
from?: number;
page?: number;
totalPage?: number;
stats?: SearchResultStats;
results?: SearchResultItem[];
repoAggregations?: any[];
langAggregations?: any[];
}
export interface SourceLocation {
line: number;
column: number;
offset: number;
}
export interface SourceRange {
startLoc: SourceLocation;
endLoc: SourceLocation;
}
export interface SourceHit {
range: SourceRange;
score: number;
term: string;
}

View file

@ -0,0 +1,12 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export enum SocketKind {
CLONE_PROGRESS = 'clone-progress',
DELETE_PROGRESS = 'delete-progress',
INDEX_PROGRESS = 'index-progress',
INSTALL_PROGRESS = 'install-progress',
}

View file

@ -0,0 +1,25 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { RepositoryUri } from './repository';
/** Time-consuming task that should be queued and executed separately */
export interface Task {
repoUri: RepositoryUri;
type: TaskType;
/** Percentage of the task, 100 means task completed */
progress: number;
  /** Revision of the repo that the task runs on. May only apply to the Index task */
revision?: string;
}
export enum TaskType {
Import,
Update,
Delete,
Index,
}
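A completed index task under the `Task` shape above might look like the literal below. The repository URI and revision are made-up example values:

```typescript
// Re-stated locally so the example is self-contained.
enum TaskType { Import, Update, Delete, Index }

interface Task {
  repoUri: string;
  type: TaskType;
  progress: number;  // 100 means the task completed
  revision?: string; // only meaningful for Index tasks
}

// Illustrative values only.
const indexTask: Task = {
  repoUri: 'github.com/example/repo',
  type: TaskType.Index,
  progress: 100,
  revision: 'HEAD',
};
```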

View file

@ -0,0 +1,21 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface Repo {
url: string;
path: string;
language: string;
}
export interface TestConfig {
repos: Repo[];
}
export enum RequestType {
INITIALIZE,
HOVER,
FULL,
}

View file

@ -0,0 +1,15 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export type RepoCmd = string | string[];
export interface RepoConfig {
repo: string;
init: RepoCmd;
}
export interface RepoConfigs {
[repoUri: string]: RepoConfig;
}
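Because `RepoCmd` is `string | string[]`, an init step can be written either as a single shell command or as an argv-style array. A hypothetical `RepoConfigs` value (repository URLs and commands below are invented):

```typescript
// Types re-stated locally so the example is self-contained.
type RepoCmd = string | string[];
interface RepoConfig { repo: string; init: RepoCmd; }
interface RepoConfigs { [repoUri: string]: RepoConfig; }

// Hypothetical configuration; URLs and commands are made up.
const repoConfigs: RepoConfigs = {
  'github.com/example/node-project': {
    repo: 'https://github.com/example/node-project',
    init: 'yarn install',               // a single shell command
  },
  'github.com/example/java-project': {
    repo: 'https://github.com/example/java-project',
    init: ['./gradlew', 'compileJava'], // or argv-style pieces
  },
};
```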

View file

@ -0,0 +1,140 @@
{
"name": "code",
"version": "0.0.1",
"description": "code",
"main": "index.js",
"private": true,
"license": "Elastic-License",
"kibana": {
"version": "kibana",
"templateVersion": "1.0.0"
},
"codeIndexVersion": "1",
"scripts": {
"_preinstall": "node ../../kibana/preinstall_check",
"kbn": "node ../../kibana/scripts/kbn",
"tslint": "tslint --project tsconfig.json --format stylish",
"start": "plugin-helpers start --config ../config/kibana/kibana.yml",
"debug-another": "node --inspect ../../kibana/scripts/kibana.js -p 5701 --config ../../config/kibana/kibana.another.yml --optimize.enabled=false --env.name=development --plugin-path=.",
"another": "node ../../kibana/scripts/kibana.js -p 5701 --config ../../config/kibana/kibana.another.yml --env.name=development --plugin-path=.",
"test:server": "plugin-helpers test:server",
"test:browser": "plugin-helpers test:browser",
"precommit": "lint-staged",
"prepush": "yarn test",
"build": "plugin-helpers build",
"start-deps": "node ../../kibana/scripts/es.js snapshot -E path.data=../../data",
"start-cache-deps": "node ../../kibana/scripts/es.js archive ../../kibana/.es/cache/elasticsearch-7.0.0-alpha1-SNAPSHOT.tar.gz -E path.data=../../data",
"debug": "node --inspect ../../kibana/scripts/kibana.js --config ../../config/kibana/kibana.yml --env.name=development --optimize.enabled=false --plugin-path=.",
"type-check": "tsc --project tsconfig.json --noEmit --pretty",
"test": "jest",
"lsp:benchmark": "cd ./server/lsp && jest --testRegex lsp_benchmark.ts"
},
"jest": {
"globals": {
"ts-jest": {
"diagnostics": false
}
},
"transform": {
"^.+\\.tsx?$": "ts-jest"
},
"testRegex": "(\\.|/)test\\.(jsx?|tsx?)$",
"moduleFileExtensions": [
"ts",
"tsx",
"js",
"jsx",
"json",
"node"
]
},
"lint-staged": {
"*.{ts,tsx}": "yarn run tslint",
"*.{js,ts,tsx}": "check_precommit_filenames"
},
"devDependencies": {
"@code/filename-check": "link:packages/code-filename-check",
"@elastic/eui": "5.0.0",
"@kbn/plugin-helpers": "link:../../kibana/packages/kbn-plugin-helpers",
"@types/angular": "^1.6.48",
"@types/boom": "7.2.0",
"@types/elasticsearch": "^5.0.24",
"@types/file-type": "^5.2.1",
    "@types/get-port": "^3.2.0",
"@types/git-url-parse": "^9.0.0",
"@types/glob": "^5.0.35",
"@types/hapi": "^17.6.2",
"@types/jest": "^23.3.1",
"@types/js-yaml": "^3.11.2",
"@types/node": "^10.5.3",
"@types/nodegit": "^0.22.1",
"@types/papaparse": "^4.5.5",
"@types/proper-lockfile": "^3.0.0",
"@types/query-string": "^6.1.0",
"@types/react": "^16.3.16",
"@types/react-dom": "^16.0.6",
"@types/react-redux": "^6.0.2",
"@types/react-router-dom": "^4.2.7",
"@types/react-test-renderer": "^16.0.3",
"@types/redux-actions": "^2.3.0",
"@types/rimraf": "^2.0.2",
"@types/sinon": "^5.0.5",
"@types/socket.io": "^1.4.38",
"@types/socket.io-client": "^1.4.32",
"@types/styled-components": "^4.1.0",
"@types/tar-fs": "^1.16.1",
"expect.js": "^0.3.1",
"husky": "^0.14.3",
"jest": "^23.5.0",
"lint-staged": "^7.2.0",
"react-test-renderer": "^16.6.3",
"sinon": "^7.0.0",
"ts-jest": "^23.1.3",
"tslint": "^5.10.0",
"tslint-react": "^3.6.0",
"typescript": "^3.0.3"
},
"dependencies": {
"@code/esqueue": "link:packages/code-esqueue",
"@code/lsp-extension": "link:../../lsp/javascript-typescript-langserver/packages/lsp-extension",
"boom": "7.2.0",
"del": "^3.0.0",
"elasticsearch": "^15.1.1",
"file-type": "^8.1.0",
"get-port": "^3.2.0",
"git-url-parse": "^9.0.1",
"github-markdown-css": "^2.10.0",
"h2o2": "^8.1.2",
"highlights": "^3.1.1",
"history": "^4.7.2",
"immer": "^1.5.0",
"language-detect": "^1.1.0",
"moment": "^2.22.2",
"monaco-editor": "^0.14.3",
"nodegit": "git+https://github.com/elastic/nodegit.git",
"papaparse": "^4.6.2",
"path-to-regexp": "^2.2.1",
"popper.js": "^1.14.3",
"prop-types": "15.5.8",
"proper-lockfile": "^3.0.2",
"query-string": "^6.1.0",
"react": "^16.3.0",
"react-dom": "^16.4.1",
"react-markdown": "^3.4.1",
"react-redux": "^5.0.7",
"react-router": "^4.3.1",
"react-router-dom": "^4.3.1",
"redux": "^4.0.0",
"redux-actions": "^2.4.0",
"redux-saga": "^0.16.0",
"reselect": "^3.0.1",
"socket.io": "^2.1.1",
"socket.io-client": "^2.1.1",
"stats-lite": "^2.2.0",
"styled-components": "^3.4.5",
"tar-fs": "^1.16.3",
"vscode-jsonrpc": "^3.6.2",
"vscode-languageserver": "^4.2.1",
"vscode-languageserver-types": "^3.10.0"
}
}

View file

@ -0,0 +1,16 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export const defaultSettings = {
DEFAULT_SETTING_TIMEOUT: 10000,
DEFAULT_SETTING_DATE_SEPARATOR: '-',
DEFAULT_SETTING_INTERVAL: 'week',
DEFAULT_SETTING_DOCTYPE: 'esqueue',
DEFAULT_SETTING_INDEX_SETTINGS: {
number_of_shards: 1,
auto_expand_replicas: '0-1',
},
};

View file

@ -0,0 +1,24 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
declare class Events {
public EVENT_QUEUE_ERROR: 'queue:error';
public EVENT_JOB_ERROR: 'job:error';
public EVENT_JOB_CREATED: 'job:created';
public EVENT_JOB_CREATE_ERROR: 'job:creation error';
public EVENT_WORKER_COMPLETE: 'worker:job complete';
public EVENT_WORKER_JOB_CLAIM_ERROR: 'worker:claim job error';
public EVENT_WORKER_JOB_SEARCH_ERROR: 'worker:pending jobs error';
public EVENT_WORKER_JOB_UPDATE_ERROR: 'worker:update job error';
public EVENT_WORKER_JOB_FAIL: 'worker:job failed';
public EVENT_WORKER_JOB_FAIL_ERROR: 'worker:failed job update error';
public EVENT_WORKER_JOB_EXECUTION_ERROR: 'worker:job execution error';
public EVENT_WORKER_JOB_TIMEOUT: 'worker:job timeout';
}
declare const events: Events;
export { events };

View file

@ -0,0 +1,20 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export const events = {
EVENT_QUEUE_ERROR: 'queue:error',
EVENT_JOB_ERROR: 'job:error',
EVENT_JOB_CREATED: 'job:created',
EVENT_JOB_CREATE_ERROR: 'job:creation error',
EVENT_WORKER_COMPLETE: 'worker:job complete',
EVENT_WORKER_JOB_CLAIM_ERROR: 'worker:claim job error',
EVENT_WORKER_JOB_SEARCH_ERROR: 'worker:pending jobs error',
EVENT_WORKER_JOB_UPDATE_ERROR: 'worker:update job error',
EVENT_WORKER_JOB_FAIL: 'worker:job failed',
EVENT_WORKER_JOB_FAIL_ERROR: 'worker:failed job update error',
EVENT_WORKER_JOB_EXECUTION_ERROR: 'worker:job execution error',
EVENT_WORKER_JOB_TIMEOUT: 'worker:job timeout',
};

View file

@ -0,0 +1,15 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { events } from './events';
import { statuses } from './statuses';
import { defaultSettings } from './default_settings';
export const constants = {
...events,
...statuses,
...defaultSettings
};

View file

@ -0,0 +1,7 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export type Status = 'pending' | 'processing' | 'completed' | 'failed' | 'cancelled';

View file

@ -0,0 +1,13 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export const statuses = {
JOB_STATUS_PENDING: 'pending',
JOB_STATUS_PROCESSING: 'processing',
JOB_STATUS_COMPLETED: 'completed',
JOB_STATUS_FAILED: 'failed',
JOB_STATUS_CANCELLED: 'cancelled',
};

View file

@ -0,0 +1,68 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { EventEmitter } from 'events';
import { events } from './constants/events';
import { Job, JobOptions } from './job';
import { AnyObject, EsClient, LogFn } from './misc';
import { Worker, WorkerFn, WorkerOptions, WorkerOutput } from './worker';
export class Esqueue extends EventEmitter {
constructor(
/**
* The base name Esqueue will use for its time-based job indices in Elasticsearch. This
* will have a date string appended to it to determine the actual index name.
*/
index: string,
options: {
/**
* The Elasticsearch client EsQueue will use to query ES and manage its queue indices
*/
client: EsClient;
/**
* A function that Esqueue will call with log messages
*/
logger?: LogFn;
/**
* Interval that Esqueue will use when creating its time-based job indices in Elasticsearch
*/
interval?: 'year' | 'month' | 'week' | 'day' | 'hour' | 'minute';
/**
* Default job timeout
*/
timeout?: number;
/**
* The _type used by Esqueue for documents created in elasticsearch
*/
doctype?: string;
/**
* The value used to separate the parts of the date in index names created by Esqueue
*/
dateSeparator?: string;
/**
* Arbitrary settings that will be merged with the default index settings EsQueue uses to
     * create Elasticsearch indices
*/
indexSettings?: AnyObject;
}
);
public addJob<P, J = Job<P>>(type: string, payload: P, options: JobOptions): J;
public registerWorker<P, R extends WorkerOutput, W = Worker<P, R>>(
this: void,
type: string,
workerFn: WorkerFn<P, R>,
opts?: Pick<WorkerOptions, Exclude<keyof WorkerOptions, 'logger'>>
): W;
}

View file

@ -0,0 +1,84 @@
/*
* Borrowed from https://github.com/elastic/kibana/tree/master/x-pack/plugins/reporting/server/lib/esqueue
* TODO(mengwei): need to abstract this esqueue as a common library when merging into kibana's main repo.
 */
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { EventEmitter } from 'events';
import { Job } from './job';
import { Worker } from './worker';
import { constants } from './constants';
import { indexTimestamp } from './helpers/index_timestamp';
function omit(obj, keysToOmit) {
return Object.keys(obj).reduce((acc, key) => (
keysToOmit.includes(key) ? acc : { ...acc, [key]: obj[key] }
), {})
}
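The local `omit` helper above rebuilds the object without the listed keys; the queue uses it to strip `client` out of the settings it keeps. A quick self-contained check of the same reduce-based pattern:

```typescript
// Same shape as the local helper above: returns a shallow copy of obj
// without the keys listed in keysToOmit.
function omit(obj: { [key: string]: any }, keysToOmit: string[]): { [key: string]: any } {
  return Object.keys(obj).reduce(
    (acc, key) => (keysToOmit.includes(key) ? acc : { ...acc, [key]: obj[key] }),
    {}
  );
}

// Mirrors how the constructor separates the client from the other settings.
const settings = omit({ client: {}, timeout: 5000, interval: 'week' }, ['client']);
```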
export class Esqueue extends EventEmitter {
constructor(index, options = {}) {
if (!index) throw new Error('Must specify an index to write to');
super();
this.index = index;
this.settings = {
interval: constants.DEFAULT_SETTING_INTERVAL,
timeout: constants.DEFAULT_SETTING_TIMEOUT,
doctype: constants.DEFAULT_SETTING_DOCTYPE,
dateSeparator: constants.DEFAULT_SETTING_DATE_SEPARATOR,
...omit(options, ['client'])
};
this.client = options.client;
this._logger = options.logger || function () {};
this._workers = [];
this._initTasks().catch((err) => this.emit(constants.EVENT_QUEUE_ERROR, err));
}
_initTasks() {
const initTasks = [
this.client.ping(),
];
return Promise.all(initTasks).catch((err) => {
this._logger(err, ['initTasks', 'error']);
throw err;
});
}
addJob(type, payload, opts = {}) {
const timestamp = indexTimestamp(this.settings.interval, this.settings.dateSeparator);
const index = `${this.index}-${timestamp}`;
const defaults = {
timeout: this.settings.timeout,
};
const options = Object.assign(defaults, opts, {
doctype: this.settings.doctype,
indexSettings: this.settings.indexSettings,
logger: this._logger
});
return new Job(this, index, type, payload, options);
}
registerWorker(type, workerFn, opts) {
const worker = new Worker(this, type, workerFn, { ...opts, logger: this._logger });
this._workers.push(worker);
return worker;
}
getWorkers() {
return this._workers.map((fn) => fn);
}
destroy() {
const workers = this._workers.filter((worker) => worker.destroy());
this._workers = workers;
}
}

View file

@ -0,0 +1,10 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export class CancellationToken {
public on(callback: () => void): void;
public cancel(): void;
}

View file

@ -0,0 +1,30 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export class CancellationToken {
constructor() {
this.isCancelled = false;
this._callbacks = [];
}
on = (callback) => {
if (typeof callback !== 'function') {
throw new Error('Expected callback to be a function');
}
if (this.isCancelled) {
callback();
return;
}
this._callbacks.push(callback);
};
cancel = () => {
this.isCancelled = true;
this._callbacks.forEach(callback => callback());
};
}
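Note the two paths in `on`: callbacks registered before `cancel()` are queued and run at cancellation time, while callbacks registered afterwards fire immediately. A self-contained demo (the class is re-stated here in TypeScript so the snippet runs on its own):

```typescript
// Minimal TypeScript re-statement of the CancellationToken above.
class CancellationToken {
  public isCancelled = false;
  private callbacks: Array<() => void> = [];

  public on(callback: () => void): void {
    if (this.isCancelled) {
      callback(); // late subscribers are invoked immediately
      return;
    }
    this.callbacks.push(callback);
  }

  public cancel(): void {
    this.isCancelled = true;
    this.callbacks.forEach(cb => cb());
  }
}

const token = new CancellationToken();
let fired = 0;
token.on(() => fired++); // registered before cancellation: queued
token.cancel();          // runs the queued callback
token.on(() => fired++); // registered after cancellation: runs at once
```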

View file

@ -0,0 +1,93 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { constants } from '../constants';
const schema = {
meta: {
// We are indexing these properties with both text and keyword fields because that's what will be auto generated
// when an index already exists. This schema is only used when a reporting index doesn't exist. This way existing
// reporting indexes and new reporting indexes will look the same and the data can be queried in the same
// manner.
properties: {
/**
* Type of object that is triggering this report. Should be either search, visualization or dashboard.
* Used for phone home stats only.
*/
objectType: {
type: 'text',
fields: {
keyword: {
type: 'keyword',
ignore_above: 256
}
}
},
/**
* Can be either preserve_layout, print or none (in the case of csv export).
* Used for phone home stats only.
*/
layout: {
type: 'text',
fields: {
keyword: {
type: 'keyword',
ignore_above: 256
}
}
},
}
},
jobtype: { type: 'keyword' },
payload: { type: 'object', enabled: false },
priority: { type: 'byte' },
timeout: { type: 'long' },
process_expiration: { type: 'date' },
created_by: { type: 'keyword' },
created_at: { type: 'date' },
started_at: { type: 'date' },
completed_at: { type: 'date' },
attempts: { type: 'short' },
max_attempts: { type: 'short' },
status: { type: 'keyword' },
output: {
type: 'object',
properties: {
content_type: { type: 'keyword' },
content: { type: 'object', enabled: false }
}
}
};
export function createIndex(client, indexName,
doctype = constants.DEFAULT_SETTING_DOCTYPE,
indexSettings = { }) {
const body = {
settings: {
...constants.DEFAULT_SETTING_INDEX_SETTINGS,
...indexSettings
},
mappings: {
[doctype]: {
properties: schema
}
}
};
return client.indices.exists({
index: indexName,
})
.then((exists) => {
if (!exists) {
return client.indices.create({
index: indexName,
body: body
})
.then(() => true);
}
return exists;
});
}

View file

@ -0,0 +1,26 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export function WorkerTimeoutError(message, props = {}) {
this.name = 'WorkerTimeoutError';
this.message = message;
this.timeout = props.timeout;
this.jobId = props.jobId;
  if ('captureStackTrace' in Error) Error.captureStackTrace(this, WorkerTimeoutError);
  else this.stack = (new Error()).stack;
}
WorkerTimeoutError.prototype = Object.create(Error.prototype);
export function UnspecifiedWorkerError(message, props = {}) {
this.name = 'UnspecifiedWorkerError';
this.message = message;
this.jobId = props.jobId;
  if ('captureStackTrace' in Error) Error.captureStackTrace(this, UnspecifiedWorkerError);
  else this.stack = (new Error()).stack;
}
UnspecifiedWorkerError.prototype = Object.create(Error.prototype);
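The two error helpers above use the ES5 constructor-function pattern so they carry extra context (`timeout`, `jobId`) alongside the message. A class-based TypeScript equivalent of the same idea — a sketch, not the plugin's actual code:

```typescript
// Class-based sketch of the WorkerTimeoutError pattern above.
// Illustrative only; the plugin itself uses constructor functions.
class WorkerTimeoutError extends Error {
  public timeout?: number;
  public jobId?: string;

  constructor(message: string, props: { timeout?: number; jobId?: string } = {}) {
    super(message);
    this.name = 'WorkerTimeoutError';
    this.timeout = props.timeout;
    this.jobId = props.jobId;
  }
}

const err = new WorkerTimeoutError('worker timed out', { timeout: 10000, jobId: 'j1' });
```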

View file

@ -0,0 +1,46 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import moment from 'moment';
export const intervals = [
'year',
'month',
'week',
'day',
'hour',
'minute'
];
export function indexTimestamp(intervalStr, separator = '-') {
if (separator.match(/[a-z]/i)) throw new Error('Interval separator can not be a letter');
const index = intervals.indexOf(intervalStr);
  if (index === -1) throw new Error(`Invalid index interval: ${intervalStr}`);
const m = moment();
m.startOf(intervalStr);
let dateString;
switch (intervalStr) {
case 'year':
dateString = 'YYYY';
break;
case 'month':
dateString = `YYYY${separator}MM`;
break;
case 'hour':
dateString = `YYYY${separator}MM${separator}DD${separator}HH`;
break;
case 'minute':
dateString = `YYYY${separator}MM${separator}DD${separator}HH${separator}mm`;
break;
default:
dateString = `YYYY${separator}MM${separator}DD`;
}
return m.format(dateString);
}
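The switch above just maps an interval name to a moment format string. The mapping in isolation, as a moment-free sketch (`indexDatePattern` is a made-up name for illustration; the real function also validates the interval and formats via moment):

```typescript
// Pure sketch of the interval -> format mapping in indexTimestamp above.
function indexDatePattern(interval: string, separator: string = '-'): string {
  switch (interval) {
    case 'year':   return 'YYYY';
    case 'month':  return `YYYY${separator}MM`;
    case 'hour':   return `YYYY${separator}MM${separator}DD${separator}HH`;
    case 'minute': return `YYYY${separator}MM${separator}DD${separator}HH${separator}mm`;
    default:       return `YYYY${separator}MM${separator}DD`; // 'week' and 'day'
  }
}
```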

View file

@ -0,0 +1,81 @@
/*
* Borrowed from https://github.com/elastic/kibana/blob/master/x-pack/common/poller.js
 */
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import _ from 'lodash';
export class Poller {
constructor(options) {
this.functionToPoll = options.functionToPoll; // Must return a Promise
this.successFunction = options.successFunction || _.noop;
this.errorFunction = options.errorFunction || _.noop;
this.pollFrequencyInMillis = options.pollFrequencyInMillis;
this.trailing = options.trailing || false;
this.continuePollingOnError = options.continuePollingOnError || false;
this.pollFrequencyErrorMultiplier = options.pollFrequencyErrorMultiplier || 1;
this._timeoutId = null;
this._isRunning = false;
}
getPollFrequency() {
return this.pollFrequencyInMillis;
}
_poll() {
return this.functionToPoll()
.then(this.successFunction)
.then(() => {
if (!this._isRunning) {
return;
}
this._timeoutId = setTimeout(this._poll.bind(this), this.pollFrequencyInMillis);
})
.catch(e => {
this.errorFunction(e);
if (!this._isRunning) {
return;
}
if (this.continuePollingOnError) {
this._timeoutId = setTimeout(this._poll.bind(this), this.pollFrequencyInMillis * this.pollFrequencyErrorMultiplier);
} else {
this.stop();
}
});
}
start() {
if (this._isRunning) {
return;
}
this._isRunning = true;
if (this.trailing) {
this._timeoutId = setTimeout(this._poll.bind(this), this.pollFrequencyInMillis);
} else {
this._poll();
}
}
stop() {
if (!this._isRunning) {
return;
}
this._isRunning = false;
clearTimeout(this._timeoutId);
this._timeoutId = null;
}
isRunning() {
return this._isRunning;
}
}
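Two properties of the `Poller` above are worth noting: `start()` is idempotent, and with `trailing` set the first poll is deferred by one interval instead of firing immediately. A trimmed sketch of just that start/stop state machine (`MiniPoller` is a made-up name; error handling and the frequency multiplier are omitted):

```typescript
// Trimmed sketch of the Poller above: only the start/stop state machine,
// behaving as if trailing were always true (first poll is deferred).
class MiniPoller {
  private timeoutId: ReturnType<typeof setTimeout> | null = null;
  private running = false;

  constructor(
    private functionToPoll: () => Promise<void>,
    private pollFrequencyInMillis: number
  ) {}

  private poll(): Promise<void> {
    return this.functionToPoll().then(() => {
      if (!this.running) return;
      this.timeoutId = setTimeout(this.poll.bind(this), this.pollFrequencyInMillis);
    });
  }

  public start(): void {
    if (this.running) return; // idempotent: a second start() is a no-op
    this.running = true;
    this.timeoutId = setTimeout(this.poll.bind(this), this.pollFrequencyInMillis);
  }

  public stop(): void {
    if (!this.running) return;
    this.running = false;
    if (this.timeoutId) clearTimeout(this.timeoutId);
    this.timeoutId = null;
  }

  public isRunning(): boolean { return this.running; }
}

const poller = new MiniPoller(() => Promise.resolve(), 1000);
poller.start();
const wasRunning = poller.isRunning();
poller.stop(); // clears the pending timeout so nothing fires afterwards
```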

View file

@ -0,0 +1,12 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export { events } from './constants/events';
export { CancellationToken } from './helpers/cancellation_token';
export { Job } from './job';
export { Worker, WorkerOutput } from './worker';
export { Esqueue } from './esqueue';
export { AnyObject, EsClient } from './misc';

View file

@ -0,0 +1,2 @@
export { events } from './constants/events';
export { Esqueue } from './esqueue';

View file

@ -0,0 +1,104 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { EventEmitter } from 'events';
import { events } from './constants/events';
import { Status } from './constants/statuses';
import { Esqueue } from './esqueue';
import { CancellationToken } from './helpers/cancellation_token';
import { AnyObject, EsClient, LogFn } from './misc';
export interface JobOptions {
client?: EsClient;
indexSettings?: string;
doctype?: string;
created_by?: string;
timeout?: number;
max_attempts?: number;
priority?: number;
headers?: {
[key: string]: string;
};
logger?: LogFn;
}
type OptionalPropType<T, P> = P extends keyof T ? T[P] : void;
export class Job<P> extends EventEmitter {
public queue: Esqueue;
public client: EsClient;
public id: string;
public index: string;
public jobtype: string;
public payload: P;
public created_by: string | false; // tslint:disable-line variable-name
public timeout: number;
public maxAttempts: number;
public priority: number;
public doctype: string;
public indexSettings: AnyObject;
public ready: Promise<void>;
constructor(queue: Esqueue, index: string, type: string, payload: P, options?: JobOptions);
/**
   * Read the job document out of Elasticsearch, including its current
   * status and possible result.
*/
public get(): Promise<{
// merged in get() method
index: string;
id: string;
type: string;
version: number;
// from doc._source
jobtype: string;
meta: {
objectType: OptionalPropType<P, 'type'>;
layout: OptionalPropType<P, 'layout'>;
};
payload: P;
priority: number;
created_by: string | false;
timeout: number;
process_expiration: string; // use epoch so the job query works
created_at: string;
attempts: number;
max_attempts: number;
status: Status;
}>;
/**
* Get a plain JavaScript representation of the Job object
*/
public toJSON(): {
id: string;
index: string;
type: string;
jobtype: string;
created_by: string | false;
payload: P;
timeout: number;
max_attempts: number;
priority: number;
};
public on(
name: typeof events['EVENT_JOB_CREATED'],
handler: (
info: {
id: string;
type: string;
index: string;
version: number;
}
) => void
): this;
public on(name: typeof events['EVENT_JOB_CREATE_ERROR'], handler: (error: Error) => void): this;
public on(name: string, ...args: any[]): this;
}

View file

@ -0,0 +1,138 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import events from 'events';
import Puid from 'puid';
import { constants } from './constants';
import { createIndex } from './helpers/create_index';
const puid = new Puid();
export class Job extends events.EventEmitter {
constructor(queue, index, type, payload, options = {}) {
if (typeof type !== 'string') throw new Error('Type must be a string');
if (!payload || typeof payload !== 'object') throw new Error('Payload must be a plain object');
super();
this.queue = queue;
this.client = options.client || this.queue.client;
this.id = puid.generate();
this.index = index;
this.jobtype = type;
this.payload = payload;
this.created_by = options.created_by || false;
this.timeout = options.timeout || 10000;
this.maxAttempts = options.max_attempts || 3;
this.priority = Math.max(Math.min(options.priority || 10, 20), -20);
this.doctype = options.doctype || constants.DEFAULT_SETTING_DOCTYPE;
this.indexSettings = options.indexSettings || {};
this.debug = (msg, err) => {
const logger = options.logger || function () {};
const message = `${this.id} - ${msg}`;
const tags = ['job', 'debug'];
if (err) {
logger(`${message}: ${err}`, tags);
return;
}
logger(message, tags);
};
const indexParams = {
index: this.index,
type: this.doctype,
id: this.id,
body: {
jobtype: this.jobtype,
meta: {
// We are copying these values out of payload because these fields are indexed and can be aggregated on
// for tracking stats, while payload contents are not.
objectType: payload.type,
layout: payload.layout ? payload.layout.id : 'none',
},
payload: this.payload,
priority: this.priority,
created_by: this.created_by,
timeout: this.timeout,
process_expiration: new Date(0), // use epoch so the job query works
created_at: new Date(),
attempts: 0,
max_attempts: this.maxAttempts,
status: constants.JOB_STATUS_PENDING,
}
};
if (options.headers) {
indexParams.headers = options.headers;
}
this.ready = createIndex(this.client, this.index, this.doctype, this.indexSettings)
.then(() => this.client.index(indexParams))
.then((doc) => {
this.document = {
id: doc._id,
type: doc._type,
index: doc._index,
version: doc._version,
};
this.debug(`Job created in index ${this.index}`);
return this.client.indices.refresh({
index: this.index
}).then(() => {
this.debug(`Job index refreshed ${this.index}`);
this.emit(constants.EVENT_JOB_CREATED, this.document);
});
})
.catch((err) => {
this.debug('Job creation failed', err);
this.emit(constants.EVENT_JOB_CREATE_ERROR, err);
});
}
emit(name, ...args) {
super.emit(name, ...args);
this.queue.emit(name, ...args);
}
get() {
return this.ready
.then(() => {
return this.client.get({
index: this.index,
type: this.doctype,
id: this.id
});
})
.then((doc) => {
return Object.assign(doc._source, {
index: doc._index,
id: doc._id,
type: doc._type,
version: doc._version,
});
});
}
toJSON() {
return {
id: this.id,
index: this.index,
type: this.doctype,
jobtype: this.jobtype,
created_by: this.created_by,
payload: this.payload,
timeout: this.timeout,
max_attempts: this.maxAttempts,
priority: this.priority
};
}
}
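One subtle line in the constructor above is the priority handling: `Math.max(Math.min(options.priority || 10, 20), -20)` defaults the priority to 10 and clamps it into [-20, 20]. Isolated as a sketch (note that because of `||`, a priority of 0 also falls back to the default, mirroring the constructor):

```typescript
// The same clamp the Job constructor applies to options.priority:
// falsy values (including 0) become 10, then the result is limited
// to the range [-20, 20].
function clampPriority(priority?: number): number {
  return Math.max(Math.min(priority || 10, 20), -20);
}
```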

View file

@ -0,0 +1,38 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export interface AnyObject {
[key: string]: any;
}
export interface EsClient {
indices: {
exists(params: AnyObject): Promise<any>;
create(params: AnyObject): Promise<any>;
refresh(params: AnyObject): Promise<any>;
delete(params: AnyObject): Promise<any>;
existsAlias(params: AnyObject): Promise<any>;
getAlias(params: AnyObject): Promise<any>;
putAlias(params: AnyObject): Promise<any>;
deleteAlias(params: AnyObject): Promise<any>;
updateAliases(params: AnyObject): Promise<any>;
getMapping(params: AnyObject): Promise<any>;
};
ping(): Promise<void>;
bulk(params: AnyObject): Promise<any>;
index(params: AnyObject): Promise<any>;
get(params: AnyObject): Promise<any>;
update(params: AnyObject): Promise<any>;
reindex(params: AnyObject): Promise<any>;
search(params: AnyObject): Promise<any>;
delete(params: AnyObject): Promise<any>;
deleteByQuery(params: AnyObject): Promise<any>;
}
export type LogFn = (msg: string | Error, tags: string[]) => void;
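Since `EsClient` is a structural interface, tests can hand the queue an in-memory stub instead of a real Elasticsearch client. A hypothetical partial stub covering only the calls the queue's init and job-creation paths touch (`esClientStub` and its backing maps are invented for this sketch):

```typescript
// Hypothetical in-memory stub for the parts of EsClient that the queue
// touches on startup and job creation (ping, indices.exists/create/refresh,
// index). Illustrative only; the plugin wires in the real ES client.
interface AnyObject { [key: string]: any; }

const docs = new Map<string, AnyObject>();
const existingIndices = new Set<string>();

const esClientStub = {
  ping: (): Promise<void> => Promise.resolve(),
  indices: {
    exists: (params: AnyObject) => Promise.resolve(existingIndices.has(params.index)),
    create: (params: AnyObject) => {
      existingIndices.add(params.index);
      return Promise.resolve({ acknowledged: true });
    },
    refresh: (_params: AnyObject) => Promise.resolve({}),
  },
  index: (params: AnyObject) => {
    docs.set(params.id, params.body); // store the document body by id
    return Promise.resolve({ _id: params.id, _index: params.index, _version: 1 });
  },
};
```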

View file

@ -0,0 +1,11 @@
{
"name": "@code/esqueue",
"version": "0.0.0",
"private": true,
"license": "Elastic-License",
"types": "./types/index.d.ts",
"dependencies": {
"moment": "^2.20.1",
"puid": "1.0.5"
}
}

View file

@ -0,0 +1,109 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { EventEmitter } from 'events';
import { events } from './constants/events';
import { Esqueue } from './esqueue';
import { CancellationToken } from './helpers/cancellation_token';
import { Job } from './job';
import { AnyObject, EsClient, LogFn } from './misc';
type Handler<A> = (arg: A) => void;
interface JobInfo {
index: string;
type: string;
id: string;
}
interface WorkerInfo {
id: string;
index: string;
jobType: string;
doctype: string;
}
interface ErrorInfo {
error: Error;
worker: WorkerInfo;
job: JobInfo;
}
interface ErrorInfoNoJob {
error: Error;
worker: WorkerInfo;
}
export type WorkerOutput<T = any> = {
content: T;
content_type: string;
max_size_reached?: any;
} | void;
export type WorkerFn<P, R extends WorkerOutput> = (
payload: P,
cancellationToken: CancellationToken
) => Promise<R>;
export interface WorkerOptions {
interval: number;
capacity: number;
intervalErrorMultiplier: number;
client?: EsClient;
size?: number;
doctype?: string;
logger?: LogFn;
}
export class Worker<P, O extends WorkerOutput> extends EventEmitter {
public id: string;
public queue: Esqueue;
public client: EsClient;
public jobType: string;
public workerFn: WorkerFn<P, O>;
public checkSize: number;
public doctype: string;
constructor(queue: Esqueue, type: string, workerFn: WorkerFn<P, O>, opts: WorkerOptions);
public destroy(): void;
/**
* Get a plain JavaScript object describing this worker
*/
public toJSON(): {
id: string;
index: string;
jobType: string;
doctype: string;
};
public on(
name: typeof events['EVENT_WORKER_COMPLETE'],
h: Handler<{
job: {
index: string;
type: string;
id: string;
};
output: O;
}>
): this;
public on(
name:
| typeof events['EVENT_WORKER_JOB_CLAIM_ERROR']
| typeof events['EVENT_WORKER_JOB_UPDATE_ERROR']
| typeof events['EVENT_WORKER_JOB_TIMEOUT']
| typeof events['EVENT_WORKER_JOB_EXECUTION_ERROR'],
h: Handler<ErrorInfo>
): this;
public on(name: typeof events['EVENT_WORKER_JOB_SEARCH_ERROR'], h: Handler<ErrorInfoNoJob>): this;
public on(
name: typeof events['EVENT_WORKER_JOB_FAIL'],
h: Handler<{ job: JobInfo; worker: WorkerInfo; output: O }>
): this;
public on(name: string, ...args: any[]): this;
}

View file

@ -0,0 +1,432 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import events from 'events';
import Puid from 'puid';
import moment from 'moment';
import { constants } from './constants';
import { WorkerTimeoutError, UnspecifiedWorkerError } from './helpers/errors';
import { CancellationToken } from './helpers/cancellation_token';
import { Poller } from './helpers/poller';
const puid = new Puid();
function formatJobObject(job) {
return {
index: job._index,
type: job._type,
id: job._id,
// Expose the payload of the job even when the job failed/timeout
payload: job._source.payload.payload,
};
}
export class Worker extends events.EventEmitter {
constructor(queue, type, workerFn, opts) {
if (typeof type !== 'string') throw new Error('type must be a string');
if (typeof workerFn !== 'function') throw new Error('workerFn must be a function');
if (typeof opts !== 'object') throw new Error('opts must be an object');
if (typeof opts.interval !== 'number') throw new Error('opts.interval must be a number');
if (typeof opts.intervalErrorMultiplier !== 'number') throw new Error('opts.intervalErrorMultiplier must be a number');
super();
this.id = puid.generate();
this.queue = queue;
this.client = opts.client || this.queue.client;
this.jobtype = type;
this.workerFn = workerFn;
this.checkSize = opts.size || 10;
this.capacity = opts.capacity || 2;
this.processingJobCount = 0;
this.doctype = opts.doctype || constants.DEFAULT_SETTING_DOCTYPE;
this.debug = (msg, err) => {
const logger = opts.logger || function () {};
const message = `${this.id} - ${msg}`;
const tags = ['worker', 'debug'];
if (err) {
logger(`${message}: ${err.stack ? err.stack : err }`, tags);
return;
}
logger(message, tags);
};
this._running = true;
this.debug(`Created worker for job type ${this.jobtype}`);
this._poller = new Poller({
functionToPoll: () => {
this._processPendingJobs();
// Return an empty promise so that the processing jobs won't block the next poll.
return Promise.resolve();
},
pollFrequencyInMillis: opts.interval,
trailing: true,
continuePollingOnError: true,
pollFrequencyErrorMultiplier: opts.intervalErrorMultiplier,
});
this._startJobPolling();
}
destroy() {
this._running = false;
this._stopJobPolling();
}
toJSON() {
return {
id: this.id,
index: this.queue.index,
jobType: this.jobtype,
doctype: this.doctype,
};
}
emit(name, ...args) {
super.emit(name, ...args);
this.queue.emit(name, ...args);
}
_formatErrorParams(err, job) {
const response = {
error: err,
worker: this.toJSON(),
};
if (job) response.job = formatJobObject(job);
return response;
}
_claimJob(job) {
const m = moment();
const startTime = m.toISOString();
const expirationTime = m.add(job._source.timeout).toISOString();
const attempts = job._source.attempts + 1;
if (attempts > job._source.max_attempts) {
const msg = (!job._source.output) ? `Max attempts reached (${job._source.max_attempts})` : false;
return this._failJob(job, msg)
.then(() => false);
}
const doc = {
attempts: attempts,
started_at: startTime,
process_expiration: expirationTime,
status: constants.JOB_STATUS_PROCESSING,
};
return this.client.update({
index: job._index,
type: job._type,
id: job._id,
version: job._version,
body: { doc }
})
.then((response) => {
const updatedJob = {
...job,
...response
};
updatedJob._source = {
...job._source,
...doc
};
return updatedJob;
})
.catch((err) => {
if (err.statusCode === 409) return true;
this.debug(`_claimJob failed on job ${job._id}`, err);
this.emit(constants.EVENT_WORKER_JOB_CLAIM_ERROR, this._formatErrorParams(err, job));
return false;
});
}
_failJob(job, output = false) {
this.debug(`Failing job ${job._id}`);
const completedTime = moment().toISOString();
const docOutput = this._formatOutput(output);
const doc = {
status: constants.JOB_STATUS_FAILED,
completed_at: completedTime,
output: docOutput
};
this.emit(constants.EVENT_WORKER_JOB_FAIL, {
job: formatJobObject(job),
worker: this.toJSON(),
output: docOutput,
});
return this.client.update({
index: job._index,
type: job._type,
id: job._id,
version: job._version,
body: { doc }
})
.then(() => true)
.catch((err) => {
if (err.statusCode === 409) return true;
this.debug(`_failJob failed to update job ${job._id}`, err);
this.emit(constants.EVENT_WORKER_FAIL_UPDATE_ERROR, this._formatErrorParams(err, job));
return false;
});
}
_cancelJob(job) {
this.debug(`Cancelling job ${job._id}`);
const completedTime = moment().toISOString();
const doc = {
status: constants.JOB_STATUS_CANCELLED,
completed_at: completedTime,
};
return this.client.update({
index: job._index,
type: job._type,
id: job._id,
version: job._version,
body: { doc }
})
.then(() => true)
.catch((err) => {
if (err.statusCode === 409) return true;
this.debug(`_cancelJob failed to update job ${job._id}`, err);
this.emit(constants.EVENT_WORKER_FAIL_UPDATE_ERROR, this._formatErrorParams(err, job));
return false;
});
}
_formatOutput(output) {
const unknownMime = false;
const defaultOutput = null;
const docOutput = {};
if (typeof output === 'object' && output.content) {
docOutput.content = output.content;
docOutput.content_type = output.content_type || unknownMime;
docOutput.max_size_reached = output.max_size_reached;
} else {
docOutput.content = output || defaultOutput;
docOutput.content_type = unknownMime;
}
return docOutput;
}
_performJob(job) {
this.debug(`Starting job ${job._id}`);
const workerOutput = new Promise((resolve, reject) => {
// run the worker's workerFn
let isResolved = false;
const cancellationToken = new CancellationToken();
cancellationToken.on(() => {
this._cancelJob(job);
});
this.processingJobCount += 1;
Promise.resolve(this.workerFn.call(null, job._source.payload, cancellationToken))
.then((res) => {
isResolved = true;
this.processingJobCount -= 1;
resolve(res);
})
.catch((err) => {
isResolved = true;
this.processingJobCount -= 1;
reject(err);
});
// fail if workerFn doesn't finish before timeout
setTimeout(() => {
if (isResolved) return;
cancellationToken.cancel();
this.processingJobCount -= 1;
this.debug(`Timeout processing job ${job._id}`);
reject(new WorkerTimeoutError(`Worker timed out, timeout = ${job._source.timeout}`, {
timeout: job._source.timeout,
jobId: job._id,
}));
}, job._source.timeout);
});
return workerOutput.then((output) => {
// job execution was successful
this.debug(`Completed job ${job._id}`);
const completedTime = moment().toISOString();
const docOutput = this._formatOutput(output);
const doc = {
status: constants.JOB_STATUS_COMPLETED,
completed_at: completedTime,
output: docOutput
};
return this.client.update({
index: job._index,
type: job._type,
id: job._id,
version: job._version,
body: { doc }
})
.then(() => {
const eventOutput = {
job: formatJobObject(job),
output: docOutput,
};
this.emit(constants.EVENT_WORKER_COMPLETE, eventOutput);
})
.catch((err) => {
if (err.statusCode === 409) return false;
this.debug(`Failure saving job output ${job._id}`, err);
this.emit(constants.EVENT_WORKER_JOB_UPDATE_ERROR, this._formatErrorParams(err, job));
});
}, (jobErr) => {
if (!jobErr) {
jobErr = new UnspecifiedWorkerError('Unspecified worker error', {
jobId: job._id,
});
}
// job execution failed
if (jobErr.name === 'WorkerTimeoutError') {
this.debug(`Timeout on job ${job._id}`);
this.emit(constants.EVENT_WORKER_JOB_TIMEOUT, this._formatErrorParams(jobErr, job));
return;
} else {
// append the jobId to the error
try {
Object.assign(jobErr, { jobId: job._id });
} catch (e) {
// do nothing if jobId can not be appended
}
}
this.debug(`Failure occurred on job ${job._id}`, jobErr);
this.emit(constants.EVENT_WORKER_JOB_EXECUTION_ERROR, this._formatErrorParams(jobErr, job));
return this._failJob(job, (jobErr.toString) ? jobErr.toString() : false);
});
}
_startJobPolling() {
if (!this._running) {
return;
}
this._poller.start();
}
_stopJobPolling() {
this._poller.stop();
}
_processPendingJobs() {
return this._getPendingJobs()
.then((jobs) => {
return this._claimPendingJobs(jobs);
});
}
_claimPendingJobs(jobs) {
if (!jobs || jobs.length === 0) return;
let claimed = 0;
return jobs.reduce((chain, job) => {
return chain.then((claimedJobs) => {
// Apply capacity control to make sure there won't be more jobs processing than the capacity.
if (claimed === (this.capacity - this.processingJobCount)) return claimedJobs;
return this._claimJob(job)
.then((claimResult) => {
if (claimResult !== false) {
claimed += 1;
claimedJobs.push(claimResult);
}
// Always return the accumulator so a failed claim doesn't break the chain.
return claimedJobs;
});
});
}, Promise.resolve([]))
.then((claimedJobs) => {
if (!claimedJobs || claimedJobs.length === 0) {
this.debug(`All ${jobs.length} jobs already claimed`);
return;
}
this.debug(`Claimed ${claimedJobs.length} jobs`);
return Promise.all(claimedJobs.map((job) => {
return this._performJob(job);
}));
})
.catch((err) => {
this.debug('Error claiming jobs', err);
});
}
_getPendingJobs() {
const nowTime = moment().toISOString();
const query = {
_source: {
excludes: [ 'output.content' ]
},
query: {
constant_score: {
filter: {
bool: {
filter: { term: { jobtype: this.jobtype } },
should: [
{ term: { status: 'pending' } },
{ bool: {
filter: [
{ term: { status: 'processing' } },
{ range: { process_expiration: { lte: nowTime } } }
] }
}
]
}
}
}
},
sort: [
{ priority: { order: 'asc' } },
{ created_at: { order: 'asc' } }
],
size: this.checkSize
};
return this.client.search({
index: `${this.queue.index}-*`,
type: this.doctype,
version: true,
body: query
})
.then((results) => {
const jobs = results.hits.hits;
if (jobs.length > 0) {
this.debug(`${jobs.length} outstanding jobs returned`);
}
return jobs;
})
.catch((err) => {
// ignore missing indices errors
if (err && err.status === 404) return [];
this.debug('job querying failed', err);
this.emit(constants.EVENT_WORKER_JOB_SEARCH_ERROR, this._formatErrorParams(err));
throw err;
});
}
}
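The claim step above relies on Elasticsearch's versioned updates for coordination between workers: the first `update` carrying the current `_version` wins, and a 409 version conflict simply means another worker got there first. A minimal sketch of that pattern, using a hypothetical stub client rather than the real `EsClient`:

```javascript
// Optimistic claim: pass the document version with the update; a version
// conflict (409) is expected contention, not an error.
async function claimJob(client, job) {
  try {
    await client.update({
      id: job._id,
      version: job._version,
      body: { doc: { status: 'processing' } },
    });
    return { ...job, _source: { ...job._source, status: 'processing' } };
  } catch (err) {
    if (err.statusCode === 409) return true; // claimed by another worker
    return false; // unexpected failure
  }
}

// Hypothetical stub: the first update succeeds, later ones conflict.
function makeStubClient() {
  let claimed = false;
  return {
    async update() {
      if (claimed) {
        const err = new Error('version conflict');
        err.statusCode = 409;
        throw err;
      }
      claimed = true;
    },
  };
}
```

The real worker additionally bumps `attempts` and sets `process_expiration`, so a claim that times out becomes claimable again on a later poll.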

@@ -0,0 +1,13 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
moment@^2.20.1:
version "2.22.2"
resolved "https://registry.yarnpkg.com/moment/-/moment-2.22.2.tgz#3c257f9839fc0e93ff53149632239eb90783ff66"
integrity sha1-PCV/mDn8DpP/UxSWMiOeuQeD/2Y=
puid@1.0.5:
version "1.0.5"
resolved "https://registry.yarnpkg.com/puid/-/puid-1.0.5.tgz#8d387bf05fb239c5e6f45902c49470084009638c"
integrity sha1-jTh78F+yOcXm9FkCxJRwCEAJY4w=

@@ -0,0 +1,3 @@
#!/usr/bin/env node
require('../../../../../kibana/src/setup_node_env');
require('../check_all_filenames');

@@ -0,0 +1,3 @@
#!/usr/bin/env node
require('../../../../../kibana/src/setup_node_env');
require('../check_precommit_filenames');

@@ -0,0 +1,21 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { resolve } from 'path';
import globby from 'globby';
import { checkFiles } from './check_files';
import { File } from './file';
import { run } from './run';
run(async log => {
const paths = await globby(['**/*', '!**/node_modules/**'], {
cwd: resolve(__dirname, '../../'),
});
return checkFiles(log, paths.map(path => new File(path)));
});

@@ -0,0 +1,128 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { basename, relative } from 'path';
import { ToolingLog } from '@kbn/dev-utils';
import chalk from 'chalk';
import minimatch from 'minimatch';
import { IGNORE_DIRECTORY_GLOBS, IGNORE_FILE_GLOBS, KEBAB_CASE_DIRECTORY_GLOBS } from './config';
import { File } from './file';
const NON_SNAKE_CASE_RE = /[A-Z \-]/;
const NON_KEBAB_CASE_RE = /[A-Z \_]/;
function listPaths(paths: string[]) {
return paths.map(path => ` - ${path}`).join('\n');
}
function matchesAnyGlob(path: string, globs: string[]) {
return globs.some(glob =>
minimatch(path, glob, {
dot: true,
})
);
}
/**
* IGNORE_DIRECTORY_GLOBS patterns match directories which should
* be ignored from casing validation. When one of the parent directories
* of a file matches these rules this function strips it from the
* path that is validated.
*
* if `file = new File('foo/bar/BAZ/index.js')` and `/foo/bar/BAZ`
* is matched by an `IGNORE_DIRECTORY_GLOBS` pattern then this
* function will return 'index.js' and only that part of the path
* will be validated.
*
* @param {File} file
* @return {string} pathToCheck
*/
function getPathWithoutIgnoredParents(file: File) {
for (const parent of file.getRelativeParentDirs()) {
if (matchesAnyGlob(parent, IGNORE_DIRECTORY_GLOBS)) {
return relative(parent, file.getRelativePath());
}
}
return file.getRelativePath();
}
/**
* Check for directories in the passed File objects which match the
* KEBAB_CASE_DIRECTORY_GLOBS and ensure that those directories use
* kebab case
*/
function checkForKebabCase(log: ToolingLog, files: File[]) {
const errorPaths = files
.reduce(
(acc, file) => {
const parents = file.getRelativeParentDirs();
return acc.concat(
parents.filter(
parent =>
matchesAnyGlob(parent, KEBAB_CASE_DIRECTORY_GLOBS) &&
NON_KEBAB_CASE_RE.test(basename(parent))
)
);
},
[] as string[]
)
.reduce((acc, path) => (acc.includes(path) ? acc : acc.concat(path)), [] as string[]);
if (errorPaths.length) {
log.error(`These directories MUST use kebab-case.\n${listPaths(errorPaths)}`);
return false;
}
return true;
}
/**
* Check that all passed File objects are using valid casing. Every
* file SHOULD be using snake_case but some files are allowed to stray
* based on casing_check_config.
*/
function checkForSnakeCase(log: ToolingLog, files: File[]) {
const errorPaths: string[] = [];
const warningPaths: string[] = [];
files.forEach(file => {
const path = file.getRelativePath();
const ignored = matchesAnyGlob(path, IGNORE_FILE_GLOBS);
if (ignored) {
log.debug('%j ignored', file);
return;
}
const pathToValidate = getPathWithoutIgnoredParents(file);
const invalid = NON_SNAKE_CASE_RE.test(pathToValidate);
if (!invalid) {
log.debug('%j uses valid casing', file);
} else {
const ignoredParent = file.getRelativePath().slice(0, -pathToValidate.length);
errorPaths.push(`${chalk.dim(ignoredParent)}${pathToValidate}`);
}
});
if (warningPaths.length) {
log.warning(`Filenames SHOULD be snake_case.\n${listPaths(warningPaths)}`);
}
if (errorPaths.length) {
log.error(`Filenames MUST use snake_case.\n${listPaths(errorPaths)}`);
return false;
}
return true;
}
export function checkFiles(log: ToolingLog, files: File[]) {
return checkForKebabCase(log, files) && checkForSnakeCase(log, files);
}
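The two regular expressions at the top of this file work by rejection: a name passes a check when the corresponding pattern finds nothing to object to. A quick illustration (the sample names are made up):

```javascript
// "Reject" patterns copied from the checker above: any uppercase letter,
// space, or wrong separator disqualifies a name.
const NON_SNAKE_CASE_RE = /[A-Z \-]/;
const NON_KEBAB_CASE_RE = /[A-Z \_]/;

const isSnakeCase = (name) => !NON_SNAKE_CASE_RE.test(name);
const isKebabCase = (name) => !NON_KEBAB_CASE_RE.test(name);
```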

@@ -0,0 +1,16 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { resolve } from 'path';
import { checkFiles } from './check_files';
import { File } from './file';
import { run } from './run';
run(log => {
const files = process.argv.slice(2).map(path => new File(resolve(path)));
return checkFiles(log, files);
});

@@ -0,0 +1,42 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
/**
* These patterns are used to identify files that are not supposed
* to be snake_case because their names are determined by other
* systems or rules.
*
* @type {Array}
*/
export const IGNORE_FILE_GLOBS = ['types/**/*', '**/{webpackShims,__mocks__}/**/*'];
/**
* These patterns are matched against directories and indicate
* folders that must use kebab case.
*
* @type {Array}
*/
export const KEBAB_CASE_DIRECTORY_GLOBS = ['packages/*'];
/**
* These patterns are matched against directories and indicate
* explicit folders that are NOT supposed to use snake_case.
*
* When a file in one of these directories is checked, the directory
* matched by these patterns is removed from the path before
* the casing check so that the files casing is still checked. This
* allows folders like `src/ui/public/flot-charts` to exist, which
* is named to match the npm package and follow the kebab-casing
* convention there, but allows us to still verify that files within
* that directory use snake_case
*
* @type {Array}
*/
export const IGNORE_DIRECTORY_GLOBS = [
...KEBAB_CASE_DIRECTORY_GLOBS,
'**/webpackShims',
'packages/*',
];

@@ -0,0 +1,68 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { dirname, extname, join, relative, resolve, sep } from 'path';
const PROJECT_ROOT = resolve(__dirname, '../../');
export class File {
private path: string;
private relativePath: string;
private ext: string;
constructor(path: string) {
this.path = resolve(path);
this.relativePath = relative(PROJECT_ROOT, this.path);
this.ext = extname(this.path);
}
public getAbsolutePath() {
return this.path;
}
public getRelativePath() {
return this.relativePath;
}
public isJs() {
return this.ext === '.js';
}
public isTypescript() {
return this.ext === '.ts' || this.ext === '.tsx';
}
public isFixture() {
return this.relativePath.split(sep).includes('__fixtures__');
}
public getRelativeParentDirs() {
const parents: string[] = [];
while (true) {
// NOTE: resolve() produces absolute paths, so we have to use join()
const parent = parents.length
? join(parents[parents.length - 1], '..')
: dirname(this.relativePath);
if (parent === '..' || parent === '.') {
break;
} else {
parents.push(parent);
}
}
return parents;
}
public toString() {
return this.relativePath;
}
public toJSON() {
return this.relativePath;
}
}

@@ -0,0 +1,7 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export { runPrecommitHook } from './run_precommit_hook';

@@ -0,0 +1,18 @@
{
"name": "@code/filename-check",
"version": "0.0.0",
"private": true,
"bin": {
"check_all_filenames": "bin/check_all_filenames",
"check_precommit_filenames": "bin/check_precommit_filenames"
},
"dependencies": {
"@kbn/dev-utils": "link:../../../../kibana/packages/kbn-dev-utils",
"chalk": "^2.4.1",
"globby": "^8.0.1",
"minimatch": "^3.0.4"
},
"devDependencies": {
"@types/globby": "^8.0.0"
}
}

@@ -0,0 +1,23 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { ToolingLog } from '@kbn/dev-utils';
const log = new ToolingLog({
level: 'info',
writeTo: process.stdout,
});
export async function run(checker: (log: ToolingLog) => boolean | Promise<boolean>) {
try {
if (!(await checker(log))) {
process.exit(1);
}
} catch (error) {
log.error(error);
process.exit(1);
}
}
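`run()` turns a checker's boolean result (or rejection) into a process exit code. A stripped-down sketch of the same control flow, with `process.exit` replaced by a returned code so it can be exercised directly (that substitution is mine, not part of the original):

```javascript
// Same control flow as run(): false or a thrown error maps to exit code 1,
// true maps to 0. `log` only needs an error() method here.
async function runChecker(checker, log = console) {
  try {
    return (await checker(log)) ? 0 : 1;
  } catch (error) {
    log.error(error);
    return 1;
  }
}
```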

File diff suppressed because it is too large

@@ -0,0 +1,18 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
import { GitBlame } from '../../common/git_blame';
export interface LoadBlamePayload {
repoUri: string;
revision: string;
path: string;
}
export const loadBlame = createAction<LoadBlamePayload>('LOAD BLAME');
export const loadBlameSuccess = createAction<GitBlame[]>('LOAD BLAME SUCCESS');
export const loadBlameFailed = createAction<Error>('LOAD BLAME FAILED');
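`createAction` from redux-actions returns an action creator whose `toString()` yields the action type, which is what lets reducers key handlers off the creator itself. A hand-rolled stand-in to illustrate the shape (this is not the real library implementation):

```javascript
// Minimal stand-in for redux-actions' createAction: wraps a payload in a
// flux-standard action and stringifies to the action type.
const createAction = (type) => {
  const actionCreator = (payload) => ({ type, payload });
  actionCreator.toString = () => type;
  return actionCreator;
};

const loadBlame = createAction('LOAD BLAME');
```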

@@ -0,0 +1,12 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
import { CommitDiff } from '../../common/git_diff';
export const loadCommit = createAction<string>('LOAD COMMIT');
export const loadCommitSuccess = createAction<CommitDiff>('LOAD COMMIT SUCCESS');
export const loadCommitFailed = createAction<Error>('LOAD COMMIT FAILED');

@@ -0,0 +1,34 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { Range } from 'monaco-editor';
import { createAction } from 'redux-actions';
import { Hover, Position, TextDocumentPositionParams } from 'vscode-languageserver';
export interface GroupedRepoReferences {
repo: string;
files: GroupedFileReferences[];
}
export interface GroupedFileReferences {
uri: string;
file: string;
language: string;
code: string;
lineNumbers: string[];
repo: string;
revision: string;
highlights: Range[];
}
export const findReferences = createAction<TextDocumentPositionParams>('FIND REFERENCES');
export const findReferencesSuccess = createAction<GroupedRepoReferences[]>(
'FIND REFERENCES SUCCESS'
);
export const findReferencesFailed = createAction<Error>('FIND REFERENCES ERROR');
export const closeReferences = createAction('CLOSE REFERENCES');
export const hoverResult = createAction<Hover>('HOVER RESULT');
export const revealPosition = createAction<Position>('REVEAL POSITION');

@@ -0,0 +1,68 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
import { FileTree } from '../../model';
import { CommitInfo, ReferenceInfo } from '../../model/commit';
export interface FetchRepoPayload {
uri: string;
}
export interface FetchRepoPayloadWithRevision extends FetchRepoPayload {
revision: string;
}
export interface FetchFilePayload extends FetchRepoPayloadWithRevision {
path: string;
}
export interface FetchRepoTreePayload extends FetchFilePayload {
limit?: number;
}
export interface FetchFileResponse {
payload: FetchFilePayload;
isNotFound?: boolean;
content?: string;
lang?: string;
isImage?: boolean;
url?: string;
}
export interface RepoTreePayload {
tree: FileTree;
path: string;
}
export const fetchRepoTree = createAction<FetchRepoTreePayload>('FETCH REPO TREE');
export const fetchRepoTreeSuccess = createAction<RepoTreePayload>('FETCH REPO TREE SUCCESS');
export const fetchRepoTreeFailed = createAction<Error>('FETCH REPO TREE FAILED');
export const resetRepoTree = createAction('CLEAR REPO TREE');
export const closeTreePath = createAction<string>('CLOSE TREE PATH');
export const openTreePath = createAction<string>('OPEN TREE PATH');
export const fetchRepoBranches = createAction<FetchRepoPayload>('FETCH REPO BRANCHES');
export const fetchRepoBranchesSuccess = createAction<ReferenceInfo[]>(
'FETCH REPO BRANCHES SUCCESS'
);
export const fetchRepoBranchesFailed = createAction<Error>('FETCH REPO BRANCHES FAILED');
export const fetchRepoCommits = createAction<FetchRepoPayloadWithRevision>('FETCH REPO COMMITS');
export const fetchRepoCommitsSuccess = createAction<CommitInfo[]>('FETCH REPO COMMITS SUCCESS');
export const fetchRepoCommitsFailed = createAction<Error>('FETCH REPO COMMITS FAILED');
export const fetchFile = createAction<FetchFilePayload>('FETCH FILE');
export const fetchFileSuccess = createAction<FetchFileResponse>('FETCH FILE SUCCESS');
export const fetchFileFailed = createAction<Error>('FETCH FILE ERROR');
export const fetchDirectory = createAction<FetchRepoTreePayload>('FETCH REPO DIR');
export const fetchDirectorySuccess = createAction<FileTree>('FETCH REPO DIR SUCCESS');
export const fetchDirectoryFailed = createAction<Error>('FETCH REPO DIR FAILED');
export const setNotFound = createAction<boolean>('SET NOT FOUND');
export const fetchTreeCommits = createAction<FetchFilePayload>('FETCH TREE COMMITS');
export const fetchTreeCommitsSuccess = createAction<{ path: string; commits: CommitInfo[] }>(
'FETCH TREE COMMITS SUCCESS'
);
export const fetchTreeCommitsFailed = createAction<Error>('FETCH TREE COMMITS FAILED');

@@ -0,0 +1,26 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
export * from './repository';
export * from './search';
export * from './file';
export * from './structure';
export * from './editor';
export * from './user';
export * from './commit';
export * from './status';
export interface Match {
isExact?: boolean;
params: { [key: string]: string };
path: string;
url: string;
location: Location;
}
export const routeChange = createAction<Match>('CODE SEARCH ROUTE CHANGE');

@@ -0,0 +1,34 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
import { Repository } from '../../model';
import { RepoConfigs } from '../../model/workspace';
export const fetchRepos = createAction('FETCH REPOS');
export const fetchReposSuccess = createAction<Repository[]>('FETCH REPOS SUCCESS');
export const fetchReposFailed = createAction<Error>('FETCH REPOS FAILED');
export const deleteRepo = createAction<string>('DELETE REPOS');
export const deleteRepoSuccess = createAction<string>('DELETE REPOS SUCCESS');
export const deleteRepoFailed = createAction<Error>('DELETE REPOS FAILED');
export const indexRepo = createAction<string>('INDEX REPOS');
export const indexRepoSuccess = createAction<string>('INDEX REPOS SUCCESS');
export const indexRepoFailed = createAction<Error>('INDEX REPOS FAILED');
export const importRepo = createAction<string>('IMPORT REPOS');
export const importRepoSuccess = createAction<string>('IMPORT REPOS SUCCESS');
export const importRepoFailed = createAction<Error>('IMPORT REPOS FAILED');
export const fetchRepoConfigs = createAction('FETCH REPO CONFIGS');
export const fetchRepoConfigSuccess = createAction<RepoConfigs>('FETCH REPO CONFIGS SUCCESS');
export const fetchRepoConfigFailed = createAction<Error>('FETCH REPO CONFIGS FAILED');
export const initRepoCommand = createAction<string>('INIT REPO CMD');
export const gotoRepo = createAction<string>('GOTO REPO');

@@ -0,0 +1,37 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
export interface DocumentSearchPayload {
query: string;
page?: number;
languages?: string;
repositories?: string;
}
export interface RepositorySearchPayload {
query: string;
}
// For document search page
export const documentSearch = createAction<DocumentSearchPayload>('DOCUMENT SEARCH');
export const documentSearchSuccess = createAction<string>('DOCUMENT SEARCH SUCCESS');
export const documentSearchFailed = createAction<string>('DOCUMENT SEARCH FAILED');
// For repository search page
export const repositorySearch = createAction<RepositorySearchPayload>('REPOSITORY SEARCH');
export const repositorySearchSuccess = createAction<string>('REPOSITORY SEARCH SUCCESS');
export const repositorySearchFailed = createAction<Error>('REPOSITORY SEARCH FAILED');
export const changeSearchScope = createAction<string>('CHANGE SEARCH SCOPE');
// For repository search typeahead
export const repositorySearchQueryChanged = createAction<RepositorySearchPayload>(
'REPOSITORY SEARCH QUERY CHANGED'
);
export const repositoryTypeaheadSearchSuccess = createAction<string>('REPOSITORY SEARCH SUCCESS');
export const repositoryTypeaheadSearchFailed = createAction<string>('REPOSITORY SEARCH FAILED');

@@ -0,0 +1,15 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
export const loadStatus = createAction<string>('LOAD STATUS');
export const loadStatusSuccess = createAction<any>('LOAD STATUS SUCCESS');
export const loadStatusFailed = createAction<string>('LOAD STATUS FAILED');
export const loadRepo = createAction<string>('LOAD REPO');
export const loadRepoSuccess = createAction<any>('LOAD REPO SUCCESS');
export const loadRepoFailed = createAction<any>('LOAD REPO FAILED');

@@ -0,0 +1,12 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
import { ResponseMessage } from 'vscode-jsonrpc/lib/messages';
export const loadStructure = createAction<string>('LOAD STRUCTURE');
export const loadStructureSuccess = createAction<ResponseMessage>('LOAD STRUCTURE SUCCESS');
export const loadStructureFailed = createAction<Error>('LOAD STRUCTURE FAILED');

@@ -0,0 +1,11 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import { createAction } from 'redux-actions';
export const loadUserConfig = createAction('USER CONFIG');
export const loadUserConfigSuccess = createAction<string>('USER CONFIG SUCCESS');
export const loadUserConfigFailed = createAction<string>('USER CONFIG FAILED');

@@ -0,0 +1,48 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import React from 'react';
import { render, unmountComponentAtNode } from 'react-dom';
import { Provider } from 'react-redux';
import 'ui/autoload/styles';
import chrome from 'ui/chrome';
import { uiModules } from 'ui/modules';
import { App } from './components/app';
import { bindSocket } from './socket';
import { store } from './stores';
// Bind the web socket client.
bindSocket(store);
const app = uiModules.get('apps/code');
app.config(($locationProvider: any) => {
$locationProvider.html5Mode({
enabled: false,
requireBase: false,
rewriteLinks: false,
});
});
app.config((stateManagementConfigProvider: any) => stateManagementConfigProvider.disable());
function RootController($scope: any, $element: any, $http: any) {
const domNode = $element[0];
// render react to DOM
render(
<Provider store={store}>
<App />
</Provider>,
domNode
);
// unmount react on controller destroy
$scope.$on('$destroy', () => {
unmountComponentAtNode(domNode);
});
}
chrome.setRootController('code', RootController);

@@ -0,0 +1,25 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
export enum SearchScope {
repository = 'repository',
default = 'default',
symbol = 'symbol',
}
export enum PathTypes {
blob = 'blob',
tree = 'tree',
}
export interface MainRouteParams {
path: string;
repo: string;
resource: string;
org: string;
revision: string;
pathType: PathTypes;
}

@@ -0,0 +1,41 @@
.repoItem {
border: 1px solid grey;
margin: 0;
padding: 1rem;
}
.importModal {
padding: 2.5rem;
min-width: 60rem;
min-height: 40rem;
}
.importModalTitle {
margin-bottom: 1rem;
}
.tabContent {
flex-direction: row;
margin-bottom: 0.5rem;
}
.addressInputLabel {
line-height: 40px;
}
.importModalInput {
margin-left: 1rem;
}
.importModalButton {
margin-left: 2rem;
}
.searchBox {
margin: 0;
}
.addRepoButton {
margin: auto;
display: block;
}


@@ -0,0 +1,383 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import React from 'react';
import { connect } from 'react-redux';
import {
EuiButton,
EuiButtonIcon,
EuiCallOut,
EuiFieldText,
EuiFlexGroup,
EuiFlexItem,
EuiModal,
EuiOverlayMask,
EuiPage,
EuiPageBody,
EuiPageContent,
EuiPageContentBody,
EuiPageContentHeader,
EuiPageContentHeaderSection,
EuiProgress,
EuiSearchBar,
EuiSideNav,
EuiTitle,
} from '@elastic/eui';
import { Link } from 'react-router-dom';
import styled from 'styled-components';
import { RepoConfigs, Repository } from '../../../model';
import { deleteRepo, importRepo, indexRepo, initRepoCommand } from '../../actions';
import { RootState } from '../../reducers';
import { CallOutType } from '../../reducers/repository';
import { FlexGrowContainer } from '../../styled_components/flex_grow_container';
import { RelativeContainer } from '../../styled_components/relative_container';
import { InlineProgressContainer } from './inline_progress_container';
const callOutTitle = {
[CallOutType.danger]: 'Sorry, there was an error',
[CallOutType.success]: 'Successfully Imported!',
};
enum Tabs {
GitAddress,
GitHub,
}
interface Props {
repositories: Repository[];
importLoading: boolean;
deleteRepo: (uri: string) => void;
indexRepo: (uri: string) => void;
importRepo: (uri: string) => void;
initRepoCommand: (uri: string) => void;
repoConfigs?: RepoConfigs;
showCallOut: boolean;
callOutMessage?: string;
callOutType?: CallOutType;
status: { [key: string]: any };
isAdmin: boolean;
}
interface State {
isModalVisible: boolean;
activeTab: Tabs;
importRepoAddress: string;
searchQuery: any;
}
interface RepositoryItemProps {
repoName: string;
repoURI: string;
deleteRepo: () => void;
indexRepo: () => void;
initRepoCommand: () => void;
hasInitCmd?: boolean;
status: any;
isAdmin: boolean;
}
const Caption = styled.div`
position: absolute;
top: 0;
left: 0;
right: 0;
text-align: center;
line-height: 16px;
`;
const Progress = (props: { progress: number; children?: React.ReactNode }) => (
<InlineProgressContainer>
<RelativeContainer>
<EuiProgress size="l" value={props.progress} max={100} />
<Caption>{props.children}</Caption>
</RelativeContainer>
</InlineProgressContainer>
);
const RepositoryItem = (props: RepositoryItemProps) => {
const initRepoButton = (
<EuiButtonIcon iconType="play" aria-label="run init command" onClick={props.initRepoCommand} />
);
  const progressPrompt = props.status
    ? props.status.progress < 0
      ? 'Clone Failed'
      : `Cloning...${props.status.progress.toFixed(2)}%`
    : '';
const progress = props.status &&
props.status.progress &&
props.status.progress < 100 && (
<Progress progress={props.status.progress}>{progressPrompt}</Progress>
);
const adminButtons = props.isAdmin ? (
<div>
{props.hasInitCmd && initRepoButton}
<EuiButtonIcon iconType="indexSettings" aria-label="settings" />
<EuiButtonIcon iconType="indexOpen" aria-label="index" onClick={props.indexRepo} />
<EuiButtonIcon iconType="trash" aria-label="delete" onClick={props.deleteRepo} />
</div>
) : null;
return (
<EuiFlexGroup className="repoItem" wrap={true} justifyContent="spaceBetween">
<EuiFlexItem>
<EuiFlexGroup direction="column" justifyContent="spaceBetween">
<div>
<Link to={`/${props.repoURI}`}>{props.repoName}</Link>
</div>
<div>
          <a href={`//${props.repoURI}`} target="_blank" rel="noopener noreferrer">
{props.repoURI}
</a>
</div>
</EuiFlexGroup>
</EuiFlexItem>
{progress}
<EuiFlexItem grow={false}>{adminButtons}</EuiFlexItem>
</EuiFlexGroup>
);
};
const initialQuery = EuiSearchBar.Query.MATCH_ALL;
class AdminPage extends React.PureComponent<Props, State> {
public state = {
isModalVisible: false,
activeTab: Tabs.GitAddress,
importRepoAddress: '',
searchQuery: initialQuery,
};
  public getSideNavItems = () => {
    return [
      {
        isSelected: this.state.activeTab === Tabs.GitAddress,
        name: 'Git Address',
        id: Tabs.GitAddress,
        onClick: this.getTabClickHandler(Tabs.GitAddress),
      },
      {
        isSelected: this.state.activeTab === Tabs.GitHub,
        name: 'GitHub',
        id: Tabs.GitHub,
        onClick: this.getTabClickHandler(Tabs.GitHub),
      },
    ];
  };
  public onImportAddressChange = (e: React.ChangeEvent<HTMLInputElement>) => {
    this.setState({ importRepoAddress: e.target.value });
  };
public importRepo = () => {
this.props.importRepo(this.state.importRepoAddress);
this.setState({ importRepoAddress: '' });
};
public getTabContent = () => {
if (this.state.activeTab === Tabs.GitAddress) {
return (
<React.Fragment>
<label className="addressInputLabel">Git Address:</label>
<EuiFieldText
className="importModalInput"
placeholder=""
value={this.state.importRepoAddress}
onChange={this.onImportAddressChange}
            aria-label="Git repository address"
/>
<EuiButton
onClick={this.importRepo}
isLoading={this.props.importLoading}
className="importModalButton"
>
Add
</EuiButton>
</React.Fragment>
);
} else if (this.state.activeTab === Tabs.GitHub) {
return null;
} else {
throw new Error('Unknown Tab');
}
};
public getTabClickHandler = (tab: Tabs) => () => {
this.setState({ activeTab: tab });
};
public openModal = () => {
this.setState({ isModalVisible: true });
};
public closeModal = () => {
this.setState({ isModalVisible: false });
};
public getDeleteRepoHandler = (uri: string) => () => {
this.props.deleteRepo(uri);
};
public getIndexRepoHandler = (uri: string) => () => {
this.props.indexRepo(uri);
};
public onSearchQueryChange = (q: any) => {
this.setState({
searchQuery: q.query,
});
};
public filterRepos = () => {
const { text } = this.state.searchQuery;
if (text) {
return this.props.repositories.filter(repo =>
repo.uri.toLowerCase().includes(text.toLowerCase())
);
} else {
return this.props.repositories;
}
};
public render() {
const repos = this.filterRepos();
const repositoriesCount = repos.length;
const items = this.getSideNavItems();
const { callOutMessage, status, showCallOut, callOutType, isAdmin } = this.props;
const callOut = showCallOut && (
<EuiCallOut title={callOutTitle[callOutType!]} color={callOutType} iconType="cross">
{callOutMessage}
</EuiCallOut>
);
const importRepositoryModal = (
<EuiOverlayMask>
<EuiModal onClose={this.closeModal} className="importModal">
<EuiTitle size="s" className="importModalTitle">
<h1>Import Repository</h1>
</EuiTitle>
<EuiFlexGroup>
<EuiFlexItem grow={false}>
<EuiSideNav items={items} />
</EuiFlexItem>
<FlexGrowContainer>
<EuiFlexItem className="tabContent">{this.getTabContent()}</EuiFlexItem>
{callOut}
</FlexGrowContainer>
</EuiFlexGroup>
</EuiModal>
</EuiOverlayMask>
);
const repoList = repos.map(repo => (
<RepositoryItem
key={repo.uri}
repoName={repo.name || ''}
repoURI={repo.uri}
deleteRepo={this.getDeleteRepoHandler(repo.uri)}
indexRepo={this.getIndexRepoHandler(repo.uri)}
initRepoCommand={this.props.initRepoCommand.bind(this, repo.uri)}
hasInitCmd={this.hasInitCmd(repo)}
status={status[repo.uri]}
isAdmin={isAdmin}
/>
));
return (
<EuiPage>
<EuiPageBody>
<EuiPageContent>
<EuiPageContentHeader>
<EuiPageContentHeaderSection>
<EuiTitle>
<h2>{repositoriesCount} repositories</h2>
</EuiTitle>
</EuiPageContentHeaderSection>
<EuiPageContentHeaderSection>
<EuiFlexGroup>
<EuiFlexItem>
<EuiSearchBar className="searchBox" onChange={this.onSearchQueryChange} />
</EuiFlexItem>
<EuiFlexItem grow={false}>
<EuiButtonIcon
className="addRepoButton"
onClick={this.openModal}
iconType="plusInCircle"
aria-label="add"
/>
</EuiFlexItem>
</EuiFlexGroup>
</EuiPageContentHeaderSection>
</EuiPageContentHeader>
<EuiPageContentBody>
<div>{repoList}</div>
</EuiPageContentBody>
</EuiPageContent>
</EuiPageBody>
{this.state.isModalVisible && importRepositoryModal}
</EuiPage>
);
}
private hasInitCmd(repo: Repository) {
if (this.props.repoConfigs) {
const config = this.props.repoConfigs[repo.uri];
return config && !!config.init;
}
return false;
}
}
const mapStateToProps = (state: RootState) => ({
repositories: state.repository.repositories,
importLoading: state.repository.importLoading,
repoConfigs: state.repository.repoConfigs,
showCallOut: state.repository.showCallOut,
callOutMessage: state.repository.callOutMessage,
callOutType: state.repository.callOutType,
status: state.status.status,
isAdmin: state.userConfig.isAdmin,
});
const mapDispatchToProps = {
deleteRepo,
importRepo,
indexRepo,
initRepoCommand,
};
export const Admin = connect(
mapStateToProps,
mapDispatchToProps
)(AdminPage);


@@ -0,0 +1,15 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import styled from 'styled-components';
export const InlineProgressContainer = styled.div`
width: 30rem;
padding: 2px;
border: 1px solid;
margin: auto 10rem;
background-color: #d9d9d9;
`;


@@ -0,0 +1,36 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/
import React from 'react';
import { HashRouter as Router, Redirect, Switch } from 'react-router-dom';
import { Admin } from './admin_page/admin';
import { Diff } from './diff_page/diff';
import { Main } from './main/main';
import { NotFound } from './main/not_found';
import { Route } from './route';
import * as ROUTES from './routes';
import { Search } from './search_page/search';
const Empty = () => null;
export const App = () => {
const redirectToAdmin = () => <Redirect to="/admin" />;
return (
<Router>
<Switch>
<Route path={ROUTES.DIFF} component={Diff} />
<Route path={ROUTES.ROOT} exact={true} render={redirectToAdmin} />
<Route path={ROUTES.MAIN} component={Main} exact={true} />
<Route path={ROUTES.MAIN_ROOT} component={Main} />
<Route path={ROUTES.ADMIN} component={Admin} />
<Route path={ROUTES.SEARCH} component={Search} />
<Route path={ROUTES.REPO} render={Empty} exact={true} />
<Route path="*" component={NotFound} />
</Switch>
</Router>
);
};

Some files were not shown because too many files have changed in this diff.