812 Commits

Author SHA1 Message Date
Arthur Lu
5153fc3f82 Merge pull request #81 from titanscouting/fix-publish
removed problematic classifier
2021-04-30 20:54:16 -07:00
Arthur Lu
8bf754f382 removed problematic classifier 2021-05-01 03:47:10 +00:00
Arthur Lu
dde3a448c2 Merge pull request #79 from titanscouting/fix-publish
attempt 2 to fix publish-analysis
2021-04-30 20:36:20 -07:00
Arthur Lu
1e234efcba attempt 2 to fix publish-analysis 2021-04-30 22:59:34 +00:00
Dev Singh
34a834c7bc Merge pull request #78 from titanscouting/fix-publish
Install deps on publish-analysis
2021-04-30 17:43:20 -05:00
Arthur Lu
4910c335f1 install deps on publish-analysis 2021-04-30 22:40:49 +00:00
Arthur Lu
9f71ab3aad tra-analysis v 3.0.0 aggregate PR (#73)
* reflected doc changes to README.md

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* tra_analysis v 2.1.0-alpha.1

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* changed setup.py to use __version__ from source
added Topic and keywords

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* updated Supported Platforms in README.md

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* moved required files back to parent

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* moved security back to parent

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* moved security back to parent
moved contributing back to parent

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* add PR template

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* moved to parent folder

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* moved meta files to .github folder

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* Analysis.py v 3.0.1

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* updated test_analysis for submodules, and added missing numpy import in Sort.py

* fixed item one of Issue #58

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* readded cache searching in postCreateCommand

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* added myself as an author

* feat: created kivy gui boilerplate

* added Kivy to requirements.txt

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* feat: gui with placeholders

* fix: changed config.json path

* migrated docker base image to debian

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* style: spaces to tabs

* migrated to ubuntu

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* fixed issues

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* fix: docker build?

* fix: use ubuntu bionic

* fix: get kivy installed

* @ltcptgeneral can't spell

* optimize dockerfile to not install unused packages

* install basic stuff while building the container

* use prebuilt image for development

* install pylint on base image

* rename and use new kivy

* tests: added tests for Array and CorrelationTest

Both are not working due to errors

* use new thing

* use 20.04 base

* symlink pip3 to pip

* use pip instead of pip3

* equation.Expression.py v 0.0.1-alpha
added corresponding .pyc to .gitignore

* parser.py v 0.0.2-alpha

* added pyparsing to requirements.txt

* parser v 0.0.4-alpha

* Equation v 0.0.1-alpha

* added Equation to tra_analysis imports

* tests: New unit tests for submoduling (#66)

* feat: created kivy gui boilerplate

* migrated docker base image to debian

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* migrated to ubuntu

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* fixed issues

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* fix: docker build?

* fix: use ubuntu bionic

* fix: get kivy installed

* @ltcptgeneral can't spell

* optimize dockerfile to not install unused packages

* install basic stuff while building the container

* use prebuilt image for development

* install pylint on base image

* rename and use new kivy

* tests: added tests for Array and CorrelationTest

Both are not working due to errors

* fix: Array no longer has *args and CorrelationTest functions no longer have self in the arguments

* use new thing

* use 20.04 base

* symlink pip3 to pip

* use pip instead of pip3

* tra_analysis v 2.1.0-alpha.2
SVM v 1.0.1
added unvalidated SVM unit tests

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* fixed version number

Signed-off-by: ltcptgeneral <learthurgo@gmail.com>

* tests: added tests for ClassificationMetric

* partially fixed and commented out svm unit tests

* fixed some SVM unit tests

* added installing pytest to devcontainer.json

* fix: small fixes to KNN

Namely, removing self from parameters and passing correct arguments to KNeighborsClassifier constructor

* fix, test: Added tests for KNN and NaiveBayes.

Also made some small fixes in KNN, NaiveBayes, and RegressionMetric

* test: finished unit tests except for StatisticalTest

Also made various small fixes and style changes

* StatisticalTest v 1.0.1

* fixed RegressionMetric unit test
temporarily disabled CorrelationTest unit tests

* tra_analysis v 2.1.0-alpha.3

* readded __all__

* fix: floating point issues in unit tests for CorrelationTest

Co-authored-by: AGawde05 <agawde05@gmail.com>
Co-authored-by: ltcptgeneral <learthurgo@gmail.com>
Co-authored-by: Dev Singh <dev@devksingh.com>
Co-authored-by: jzpan1 <panzhenyu2014@gmail.com>

* fixed deprecated escape sequences

* fixed tests, indent, import in test_analysis

* changed version to 3.0.0
added backwards compatibility

* fixed pytest install in container

* removed GUI changes

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* incremented version to rc.1 (release candidate 1)

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* fixed NaiveBayes __changelog__

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* fix: __setitem__ == to single =

* Array v 1.0.1

* Revert "Array v 1.0.1"

This reverts commit 59783b79f7.

* Array v 1.0.1

* Array.py v 1.0.2
added more Array unit tests

* cleaned .gitignore
tra_analysis v 3.0.0-rc2

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* added *.pyc to gitignore
finished subdividing test_analysis

* feat: gui layout + basic func

* Froze and removed superscript (data-analysis)

* remove data-analysis deps install for devcontainer

* tukey pairwise comparison and multicomparison but no critical q-values

* quick patch for devcontainer.json

* better fix for devcontainer.json

* fixed some styling in StatisticalTest
removed print statement in StatisticalTest unit tests

* update analysis tests to be more efficient

* don't use loop for test_nativebayes

* removed useless secondary docker files

* tra-analysis v 3.0.0

Co-authored-by: James Pan <panzhenyu2014@gmail.com>
Co-authored-by: AGawde05 <agawde05@gmail.com>
Co-authored-by: zpan1 <72054510+zpan1@users.noreply.github.com>
Co-authored-by: Dev Singh <dev@devksingh.com>
Co-authored-by: = <=>
Co-authored-by: Dev Singh <dsingh@imsa.edu>
Co-authored-by: zpan1 <zpan@imsa.edu>
2021-04-28 19:33:50 -05:00
Arthur Lu
764dab01f6 reflected doc changes to README.md (#48)
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-10-05 09:49:39 -05:00
Dev Singh
56f5e5262c deps: remove dnspython (#47)
Signed-off-by: Dev Singh <dev@devksingh.com>

Co-authored-by: Arthur Lu <learthurgo@gmail.com>
2020-09-28 18:53:32 -05:00
Arthur Lu
56a5578f35 Merge pull request #46 from titanscouting/multithread-testing
Implement Multithreading in Superscript
2020-09-28 17:46:29 -05:00
Dev Singh
c48c512cf6 Implement fitting to circle using LSC and HyperFit (#45)
* chore: add pylint to devcontainer

Signed-off-by: Dev Singh <dev@devksingh.com>

* feat: init LSC fitting

cuda and cpu-based LSC fitting using cupy and numpy

Signed-off-by: Dev Singh <dev@devksingh.com>

* docs: add changelog entry and module to class list

Signed-off-by: Dev Singh <dev@devksingh.com>

* docs: fix typo in comment

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: only import cupy if cuda available

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: move to own file, abandon cupy

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: remove numba dep

Signed-off-by: Dev Singh <dev@devksingh.com>

* deps: remove cupy dep

Signed-off-by: Dev Singh <dev@devksingh.com>

* feat: add tests

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: correct indentation

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: variable names

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: add self when referring to coords

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: numpy ordering

Signed-off-by: Dev Singh <dev@devksingh.com>

* docs: remove version bump, nomaintain

add notice that module is not actively maintained, may be removed in future release

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix: remove hyperfit as it is not being implemented

Signed-off-by: Dev Singh <dev@devksingh.com>
2020-09-24 21:06:30 -05:00
Dev Singh
d15aa045de docs: create security reporting guidelines (#44)
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-09-24 13:09:34 -05:00
Arthur Lu
b32083c6da added tra-analysis to data-analysis requirements
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-24 13:14:13 +00:00
Arthur Lu
a999c755a1 Merge branch 'multithread-testing' of https://github.com/titanscouting/red-alliance-analysis into multithread-testing 2020-09-26 20:57:55 +00:00
Arthur Lu
e3241fa34d superscript.py v 0.8.2
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-26 20:57:39 +00:00
Dev Singh
97f3271de3 Merge branch 'master' into multithread-testing 2020-09-26 15:28:14 -05:00
Arthur Lu
2804d03593 superscript.py v 0.8.1
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-21 07:38:18 +00:00
Arthur Lu
adbc749c47 added max-threads key in config
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-21 07:21:59 +00:00
Arthur Lu
ec9bac7830 superscript.py v 0.8.0
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-21 05:59:15 +00:00
Arthur Lu
b9a2e680bc Merge pull request #43 from titanscouting/master-staged
Pull changes from master staged to master for release
2020-09-19 21:06:42 -05:00
Arthur Lu
467444ed9b Merge branch 'master' into master-staged 2020-09-19 20:05:33 -05:00
Arthur Lu
fa7216d4e0 modified setup.py for analysis package v 2.1.0
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-20 00:50:14 +00:00
Arthur Lu
27a86e568b deprecated nonfunctional scripts in data-analysis
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-20 00:47:33 +00:00
Arthur Lu
16502c5259 superscript.py v 0.7.0
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-20 00:45:38 +00:00
Arthur Lu
ff9ad078e5 analysis.py v 2.3.1
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-19 23:14:46 +00:00
Arthur Lu
97334d1f66 edited README.md
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-19 22:40:20 +00:00
Arthur Lu
f566f4ec71 Merge pull request #42 from titanscouting/devksingh4-patch-1
docs: add documentation links
2020-09-19 17:07:57 -05:00
Arthur Lu
cd869c0a8e analysis.py v 2.3.0
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-19 22:04:24 +00:00
Arthur Lu
f1982eb93d analysis.py v 2.2.3
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-18 21:55:59 +00:00
Arthur Lu
3763cb041f analysis.py v 2.2.2
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-17 02:11:44 +00:00
Dev Singh
2a201a61c7 docs: add documentation links 2020-09-16 16:54:49 -05:00
Arthur Lu
73a16b8397 added deprecated config files to gitignore
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-16 21:24:50 +00:00
Arthur Lu
0e7255ab99 changed && to ; in devcontainer.json
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-09-15 23:24:50 +00:00
Arthur Lu
5efaee5176 Merge pull request #41 from titanscouting/master-staged
merge eol fix in master-staged to master
2020-08-13 12:04:54 -05:00
Arthur Lu
1a1be8ee6a fixed eol issue with docker in gitattributes
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-13 17:01:08 +00:00
Arthur Lu
cab05fbc63 Merge commit '4b664acffb5777614043a83ef8e08368e21303ce' into master-staged 2020-08-13 17:00:31 +00:00
Dev Singh
4b664acffb Modernize VSCode extensions in dev env, set correct copyright assignment (#40)
* modernize extensions

Signed-off-by: Dev Singh <dev@devksingh.com>

* copyright assignment should be to titan scouting

Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-12 21:59:04 -05:00
Arthur Lu
292f9faeef Merge pull request #39 from titanscouting/master-staged
merge README changes from master-staged to master
2020-08-10 20:49:01 -05:00
Arthur Lu
468bd48b07 fixed readme with proper pip installation
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-11 01:36:30 +00:00
Arthur Lu
4c3f16f13b Merge pull request #38 from titanscouting/master
pull master into master-staged
2020-08-10 20:33:28 -05:00
Arthur Lu
8545a0d984 Merge pull request #36 from titanscouting/tra-service
merge changes from tra-service to master
2020-08-10 19:40:28 -05:00
Arthur Lu
6debc07786 modified README
simplified devcontainer.json

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-11 00:29:23 +00:00
Arthur Lu
bc5b07bb8d readded old config files
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-10 23:32:50 +00:00
Arthur Lu
9b73147c4d fixed analysis reference in superscript_old
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-10 23:20:43 +00:00
Arthur Lu
2f84debda7 removed old bins under analysis-master/dist/
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-10 21:37:41 +00:00
Arthur Lu
c803208eb8 analysis.py v 2.2.1
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-10 21:25:25 +00:00
Arthur Lu
135350293c Merge branch 'master' into tra-service 2020-08-10 16:11:38 -05:00
Arthur Lu
9a3181a92b renamed analysis folder to tra_analysis
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-08-10 21:01:50 +00:00
Dev Singh
73da5fa68b docs
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-10 14:53:22 -05:00
Dev Singh
7be57f7e7e build v2.0.3
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-10 14:52:49 -05:00
Dev Singh
7d64e67ad3 run on publish
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-10 14:46:07 -05:00
Dev Singh
def639284f remove bad if statement
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-10 14:43:16 -05:00
Dev Singh
18430208ff Merge branch 'master' of https://github.com/titanscout2022/red-alliance-analysis 2020-08-10 14:42:58 -05:00
Dev Singh
ba5fb2d72b build on release only (#35)
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-10 14:40:22 -05:00
Dev Singh
f34452d584 build on release only
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-08-10 14:35:38 -05:00
Dev Singh
5fd5e32cb1 Implement CD with building on tags to PyPI (#34)
* Create python-publish.yml

* populated publish-analysis.yml
moved legacy versions of analysis to separate subfolder

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* attempt to fix issue with publish action

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* another attempt to fix publish-analysis.yml

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* this should work now

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* pypa can't take just one package so i'm trying all

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* this should totally work now

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* trying removing custom dir

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* rename analysis to tra_analysis, bump version to 2.0.0

* remove old packages which are already on github releases

* remove pycache

* removed ipynb_checkpoints

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* build

* do the dir thing

* trying removing custom dir

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
Signed-off-by: Dev Singh <dev@devksingh.com>

* rename analysis to tra_analysis, bump version to 2.0.0

Signed-off-by: Dev Singh <dev@devksingh.com>

* remove old packages which are already on github releases

Signed-off-by: Dev Singh <dev@devksingh.com>

* remove pycache

Signed-off-by: Dev Singh <dev@devksingh.com>

* build

Signed-off-by: Dev Singh <dev@devksingh.com>

* removed ipynb_checkpoints

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
Signed-off-by: Dev Singh <dev@devksingh.com>

* do the dir thing

Signed-off-by: Dev Singh <dev@devksingh.com>

* Revert "do the dir thing"

This reverts commit 2eb7ffca8d.

* correct dir

* set correct yaml positions

Signed-off-by: Dev Singh <dev@devksingh.com>

* attempt to set correct dir

Signed-off-by: Dev Singh <dev@devksingh.com>

* run on tags only

Signed-off-by: Dev Singh <dev@devksingh.com>

* remove all caches from vcs

Signed-off-by: Dev Singh <dev@devksingh.com>

* bump version for testing

Signed-off-by: Dev Singh <dev@devksingh.com>

* remove broke build

Signed-off-by: Dev Singh <dev@devksingh.com>

* dont upload dists to github

Signed-off-by: Dev Singh <dev@devksingh.com>

* bump to 2.0.2 for testing

Signed-off-by: Dev Singh <dev@devksingh.com>

* fix yaml

Signed-off-by: Dev Singh <dev@devksingh.com>

* update docs

Signed-off-by: Dev Singh <dev@devksingh.com>

* add to readme

Signed-off-by: Dev Singh <dev@devksingh.com>

* run only on master

Signed-off-by: Dev Singh <dev@devksingh.com>

Co-authored-by: Arthur Lu <learthurgo@gmail.com>
Co-authored-by: Dev Singh <dsingh@CentaurusRidge.localdomain>
2020-08-10 14:29:51 -05:00
Arthur Lu
3db3dda315 Merge pull request #33 from titanscout2022/Demo-for-Issue#32
Merge Changes Proposed in Issue#32
2020-08-02 17:27:26 -05:00
Arthur Lu
a59e509bc8 made changes described in Issue#32
changed setup.py to also reflect versioning changes

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-07-30 19:05:07 +00:00
Arthur Lu
ad521368bd filled out Contributing section in README.md
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-07-20 19:07:32 -05:00
Dev Singh
edbfa5184a Update MAINTAINERS (#29)
Signed-off-by: Dev Singh <dev@devksingh.com>
2020-07-19 11:52:11 -05:00
Arthur Lu
5e52155fd0 Merge pull request #31 from titanscout2022/master
merge changes from master into tra-service
2020-07-18 23:25:55 -05:00
Arthur Lu
635f736a69 Merge pull request #28 from titanscout2022/master-staged
Merge analysis.py updates to master
2020-07-12 18:26:03 -05:00
Arthur Lu
16fb21001a added negatives to analysis unit tests
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-07-12 13:57:24 -05:00
Arthur Lu
69559e9e4a Merge branch 'master' into master-staged 2020-07-11 17:03:50 -05:00
Arthur Lu
430822cdeb added unit tests for analysis.Sort algorithms
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-07-11 21:53:16 +00:00
Arthur Lu
daa5b48426 readded old superscript.py (v 0.0.5.002)
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-07-11 21:21:56 +00:00
Arthur Lu
648ac945ac analysis v 1.2.2.000
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-07-05 05:30:48 +00:00
Arthur Lu
b2cf594869 readded tra.py as a fallback option
made changes to tra-cli.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 23:15:34 +00:00
Arthur Lu
bcd6c66a08 fixed more bugs with tra-cli.py
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 21:47:54 +00:00
Arthur Lu
b646e22378 fixed bugs with tra-cli.py
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 21:32:43 +00:00
Arthur Lu
51f14de0d2 fixed latest.whl to follow format for wheel files
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 20:56:13 +00:00
Arthur Lu
266caf78c3 started on tra-cli.py
modified tasks.py to work properly

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 20:23:53 +00:00
Arthur Lu
fa478314da added data-analysis requirements to devcontainer build
added auto pip install latest analysis.py whl

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 18:25:41 +00:00
Arthur Lu
8a212a21df moved core functions in tasks.py to class Tasker
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 18:19:58 +00:00
Arthur Lu
236c28c3be renamed tra.py to tasks.py
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-06-10 17:46:40 +00:00
Arthur Lu
7c2f058feb added help message to status command
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-26 01:34:47 +00:00
Arthur Lu
e84783ee44 populated tra.py to be a CLI application
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-25 22:17:08 +00:00
Arthur Lu
09b703d2a7 removed extra words
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 17:56:00 +00:00
Arthur Lu
098326584a removed more extra lines
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 17:54:48 +00:00
Arthur Lu
e5c7718f10 fixed extra hline
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 17:52:25 +00:00
Arthur Lu
a3ffdd89d0 fixed line breaks
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 17:51:57 +00:00
Arthur Lu
2fc11285ba fixed Prerequisites in README.md
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 17:35:02 +00:00
Arthur Lu
9dd38fcec8 added OS and python versions supported
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 17:30:01 +00:00
Arthur Lu
d59d069943 analysis.py v 1.2.1.004 (#27)
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 11:49:04 -05:00
Arthur Lu
90f747f3fc revamped README.md
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-24 16:42:58 +00:00
Arthur Lu
869d7c288b fixed naming in tra.py
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-23 22:51:58 -05:00
Arthur Lu
dc4f5ab40e another bug fix
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-23 22:49:38 -05:00
Arthur Lu
a739007222 quick bug fix to tra.py
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-23 22:48:50 -05:00
Arthur Lu
ba06b9293e added test.py to .gitignore
prepared tra.py for threading implementation

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-23 19:43:59 -05:00
Arthur Lu
1d5a67c4f7 analysis.py v 1.2.1.004
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-22 00:37:39 +00:00
Arthur Lu
e4ab0487d0 Merge pull request #26 from titanscout2022/master
Merge master into master-staged
2020-05-21 19:36:56 -05:00
Arthur Lu
4f439d6094 Merge service-dev changes with master (#24)
* added config.json
removed old config files

Signed-off-by: Arthur <learthurgo@gmail.com>

* superscript.py v 0.0.6.000

Signed-off-by: Arthur <learthurgo@gmail.com>

* changed data.py

Signed-off-by: Arthur <learthurgo@gmail.com>

* changes to config.json

Signed-off-by: Arthur <learthurgo@gmail.com>

* removed cells from visualize_pit.py

Signed-off-by: Arthur <learthurgo@gmail.com>

* more changes to visualize_pit.py

Signed-off-by: Arthur <learthurgo@gmail.com>

* added analysis-master/metrics/__pycache__ to git ignore
moved pit configs in config.json to the bottom
superscript.py v 0.0.6.001

Signed-off-by: Arthur <learthurgo@gmail.com>

* removed old database key

Signed-off-by: Arthur <learthurgo@gmail.com>

* adjusted config files

Signed-off-by: Arthur <learthurgo@gmail.com>

* Delete config-pop.json

* fixed .gitignore

Signed-off-by: Arthur <learthurgo@gmail.com>

* analysis.py 1.2.1.003
added team kv pair to config.json

Signed-off-by: Arthur <learthurgo@gmail.com>

* superscript.py v 0.0.6.002

Signed-off-by: Arthur <learthurgo@gmail.com>

* finished app.py API
made minute changes to parentheses use in various packages

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* bug fixes in app.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* bug fixes in app.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* made changes to .gitignore

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* made changes to .gitignore

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* deleted a __pycache__ folder from metrics

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* more changes to .gitignore

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* additions to app.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* renamed app.py to api.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* removed extraneous files

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* renamed api.py to tra.py
removed rest api calls from tra.py

* renamed api.py to tra.py
removed rest api calls from tra.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* removed flask import from tra.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* changes to devcontainer.json

Signed-off-by: Arthur Lu <learthurgo@gmail.com>

* fixed unit tests to be correct
removed some regression tests because of potential function overflow
removed trueskill unit test because of slight deviation chance

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-20 08:52:38 -05:00
Arthur Lu
ae64c7f10e Merge pull request #25 from titanscout2022/master-staged
fixed bug in analysis unit tests
2020-05-19 13:19:47 -05:00
Arthur Lu
d1dfe3b01b Merge branch 'master' into master-staged 2020-05-19 13:19:40 -05:00
Arthur Lu
3dd24dcd30 fixed bug in analysis unit tests
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-19 18:19:02 +00:00
Arthur Lu
2be67b2cc3 Merge pull request #23 from titanscout2022/master-staged
Merge minor .gitignore changes
2020-05-18 16:31:50 -05:00
Arthur Lu
f91159c49c added data-analysis/.pytest_cache/ to .gitignore
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 16:28:42 -05:00
Arthur Lu
df046d4806 Merge pull request #22 from titanscout2022/master
Reflect master to master-staged
2020-05-18 16:28:05 -05:00
Arthur Lu
c838c4fc15 Merge pull request #21 from titanscout2022/CI-tools
CI tools
2020-05-18 16:18:48 -05:00
Arthur Lu
cbf5d18332 i swear it's working now
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 16:14:16 -05:00
Arthur Lu
641905e87a finally fixed issues
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 16:12:22 -05:00
Arthur Lu
3daa12a3da changes
superscript testing still refuses to collect any tests

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 16:07:02 -05:00
Arthur Lu
3c4fe7ab46 still not working
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 16:01:02 -05:00
Arthur Lu
4e3f6b4480 readded pytest install
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:59:34 -05:00
Arthur Lu
414ffdf96c removed flake8 import from unit tests
fixed superscript unit tests

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:58:17 -05:00
Arthur Lu
6296f78ff5 removed lint checks because they were being stupid
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:54:15 -05:00
Arthur Lu
7ae64d5dbf lint refused to exclude metrics
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:51:51 -05:00
Arthur Lu
fd2ac12dad excluded imports
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:49:52 -05:00
Arthur Lu
0f2bbd1a16 more fixes
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:44:39 -05:00
Arthur Lu
83bc7fa743 Merge branch 'CI-tools' of https://github.com/titanscout2022/red-alliance-analysis into CI-tools
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:44:20 -05:00
Arthur Lu
83eabce8cd also ignored regression.py
added temporary unit test for superscript.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:43:53 -05:00
Arthur Lu
e2e73986a2 also ignored regression.py
added temporary unit test for superscript.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:43:36 -05:00
Arthur Lu
91ae1c0df6 attempted fixes by excluding titanlearn
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:39:59 -05:00
Arthur Lu
efad5bd71c maybe its a versioning issue?
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:32:24 -05:00
Arthur Lu
3d5e0aac59 Revert "trying python3 and pip3"
This reverts commit 7937fb6ee6.
2020-05-18 15:29:51 -05:00
Arthur Lu
7937fb6ee6 trying python3 and pip3
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:27:56 -05:00
Arthur Lu
871ecb5561 attempt to fix working directory issue
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:25:19 -05:00
Arthur Lu
7d738ca51e another attempt
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:11:24 -05:00
Arthur Lu
eeee957d23 attempt to fix working directory issues
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 15:07:42 -05:00
Arthur Lu
f55f3cb7d1 populated analysis unit test
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-18 14:59:24 -05:00
Arthur Lu
dd11689c8c reverted indentation to github default
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 20:15:43 -05:00
Arthur Lu
1c4b1d1971 more indentation fixes
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 20:12:15 -05:00
Arthur Lu
94a7aae491 changed indentation to spaces
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 20:09:29 -05:00
Arthur Lu
26f4224caa fixed indents
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 20:07:44 -05:00
Arthur Lu
386b7c75ee added items to .gitignore
renamed pythonpackage.yml to ut-analysis.yml
populated ut-analysis.yml
fixed spelling
added ut-superscript.py

Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 20:04:31 -05:00
Arthur Lu
27feb0bf93 moved unit-test.py outside the analysis folder
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 19:41:19 -05:00
Arthur Lu
233440f03d removed pythonapp because it is redundant
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 19:40:35 -05:00
Arthur Lu
37c247aa46 created unit-test.py
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-16 19:33:56 -05:00
Arthur Lu
eeb5e46814 Merge pull request #19 from titanscout2022/CI-package
merge
2020-05-16 19:31:25 -05:00
Arthur Lu
4739c439f0 Create pythonpackage.yml 2020-05-16 19:30:52 -05:00
Arthur Lu
2e41326373 Create pythonapp.yml 2020-05-16 19:29:14 -05:00
Arthur Lu
e8ba8e1008 Merge pull request #18 from titanscout2022/master-staged
analysis.py v 1.2.1.003
2020-05-15 16:06:02 -05:00
Arthur Lu
dd49f6724f Merge branch 'master' into master-staged 2020-05-15 16:05:52 -05:00
Arthur Lu
b376f7c0c5 Merge pull request #17 from titanscout2022/equation.py-testing
merge equation.py-testing with master
2020-05-15 16:01:41 -05:00
Arthur Lu
4213386035 Merge branch 'master' into master-staged 2020-05-15 14:54:24 -05:00
Arthur Lu
3fdae646b8 analysis.py v 1.2.1.003
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-15 14:48:26 -05:00
Arthur Lu
8f8fb6c156 analysis.py v 1.2.2.000
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-14 23:36:28 -05:00
Arthur Lu
30b39aafff Merge pull request #16 from titanscout2022/master
pull recent changes into equation.py-testing
2020-05-14 23:22:03 -05:00
ltcptgeneral
77353c87e3 Merge pull request #15 from titanscout2022/master-staged
mirrored .gitignore changes from gui-dev
2020-05-14 19:29:44 -05:00
ltcptgeneral
ca2ebe5f6d Merge branch 'master' into master-staged 2020-05-14 19:18:34 -05:00
Arthur
55c7589c7d mirrored .gitignore changes from gui-dev
Signed-off-by: Arthur <learthurgo@gmail.com>
2020-05-14 19:17:26 -05:00
ltcptgeneral
6cff61cbe4 Merge pull request #13 from titanscout2022/devksingh4-bsd-license
Switch to BSD License
2020-05-13 13:19:10 -05:00
Dev Singh
5474081523 Update LICENSE 2020-05-13 12:04:59 -05:00
Dev Singh
4c25a5ce09 Contributing guideline changes (#11)
* changes

* more changes

* more changes

* contributing.md: sync with other contributor docs

Signed-off-by: Dev Singh <dev@singhk.dev>

* Create MAINTAINERS

Signed-off-by: Dev Singh <dev@singhk.dev>

* arthur's changes

* changes

Signed-off-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>

* more changes

Signed-off-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>

* more changes

Signed-off-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>

* contributing.md: sync with other contributor docs

Signed-off-by: Dev Singh <dev@singhk.dev>
Signed-off-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>

* Create MAINTAINERS

Signed-off-by: Dev Singh <dev@singhk.dev>
Signed-off-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>

* arthur's changes

Signed-off-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>

Co-authored-by: ltcptgeneral <35508619+ltcptgeneral@users.noreply.github.com>
2020-05-13 11:56:52 -05:00
ltcptgeneral
3451bac6f5 Merge pull request #12 from titanscout2022/master-staged
analysis.py v 1.2.1.002
2020-05-13 11:44:25 -05:00
ltcptgeneral
7e37dd72bb analysis.py v 1.2.1.002
Signed-off-by: Arthur Lu <learthurgo@gmail.com>
2020-05-13 11:35:46 -05:00
ltcptgeneral
a9014c5d34 changed data analysis folder to data-analysis
added egg-info and build folders to git ignore
deleted egg-info and build folders
2020-05-12 20:54:19 -05:00
ltcptgeneral
230e98a745 9 2020-05-12 20:48:45 -05:00
ltcptgeneral
1c6ecb149b Merge branch 'equation.py-testing' of https://github.com/titanscout2022/tr2022-strategy into equation.py-testing 2020-05-12 20:46:51 -05:00
ltcptgeneral
6d544a434e readded equation.ipynb 2020-05-12 20:46:42 -05:00
ltcptgeneral
5a1aa780ff readded equation.ipynb 2020-05-12 20:43:31 -05:00
ltcptgeneral
952981cdb9 bug fixes 2020-05-12 20:39:23 -05:00
ltcptgeneral
6fee42f6d2 bug fixes 2020-05-12 20:21:11 -05:00
ltcptgeneral
24f8961500 analysis.py v 1.2.1.001 2020-05-12 20:19:58 -05:00
ltcptgeneral
db8fbbf068 visualization.py v 1.0.0.001 2020-05-05 22:37:32 -05:00
ltcptgeneral
64ae1b7026 analysis.py v 1.2.1.000 2020-05-04 14:50:36 -05:00
ltcptgeneral
4498387ac5 analysis.py v 1.2.0.006 2020-05-04 11:59:25 -05:00
ltcptgeneral
7a362476c9 fixed indent part 2 2020-05-01 23:16:32 -05:00
ltcptgeneral
b79cedae68 fixed indentation 2020-05-01 23:14:19 -05:00
ltcptgeneral
2bcd4236bb moved equation.ipynb to correct folder 2020-05-01 23:06:32 -05:00
ltcptgeneral
0cc35dc02d Merge pull request #10 from titanscout2022/master
merge file changes from master into equation.py-testing
2020-05-01 23:04:33 -05:00
ltcptgeneral
43bb9ef2bb analysis.py v 1.2.0.005 2020-05-01 22:59:54 -05:00
ltcptgeneral
3ab1d0f50a converted space indentation to tab indentation 2020-05-01 16:15:07 -05:00
ltcptgeneral
88e7c52c8b fixes 2020-05-01 16:07:57 -05:00
ltcptgeneral
b345bfb95b reconsolidated arm64 and amd64 versions 2020-05-01 15:52:27 -05:00
ltcptgeneral
aeb4990c81 analysis pkg v 1.0.0.12
analysis.py v 1.2.0.004
2020-04-30 16:03:37 -05:00
ltcptgeneral
0a721ca500 8 2020-04-30 15:22:37 -05:00
ltcptgeneral
37a4a0085e 7 2020-04-29 23:02:02 -05:00
ltcptgeneral
429d3eb42c 6 2020-04-29 22:34:43 -05:00
ltcptgeneral
60ffe7645b 5 2020-04-29 19:01:55 -05:00
ltcptgeneral
adfa6f5cc0 4 2020-04-29 17:24:59 -05:00
ltcptgeneral
f9c25dad09 3 2020-04-29 12:58:44 -05:00
ltcptgeneral
b1d5834ff1 2 2020-04-29 00:35:19 -05:00
ltcptgeneral
357d4977eb 1 2020-04-29 00:34:16 -05:00
ltcptgeneral
4545f5721a analysis.py v 1.2.0.003 2020-04-28 04:00:19 +00:00
ltcptgeneral
8d703b10b3 analysis.py v 1.2.0.002 2020-04-22 03:32:34 +00:00
ltcptgeneral
df305f30f0 analysis.py v 1.2.0.001 2020-04-21 04:08:00 +00:00
ltcptgeneral
a123b71ac9 Merge pull request #9 from titanscout2022/fix
testing release 1.2 of analysis.py
2020-04-20 00:10:24 -05:00
ltcptgeneral
a02668e59c Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2020-04-20 05:07:17 +00:00
ltcptgeneral
4d6372f620 removed depreciated files to seperate repository 2020-04-20 05:07:07 +00:00
ltcptgeneral
9d0b6e68d8 Update README.md 2020-04-20 00:02:35 -05:00
ltcptgeneral
b8d51811e0 testing release 1.2 of analysis.py 2020-04-13 19:58:04 +00:00
ltcptgeneral
7a58cd08e2 analysis pkg v 1.0.0.11
analysis.py v 1.1.13.009
superscript.py v 0.0.5.002
2020-04-12 02:51:40 +00:00
ltcptgeneral
337fae68ee analysis pkg v 1.0.0.10
analysis.py v 1.1.13.008
superscript.py v 0.0.5.001
2020-04-09 22:16:26 -05:00
art
5e71d05626 removed app from dep 2020-04-05 21:42:12 +00:00
art
01df42aa49 added gitgraph to vscode container 2020-04-05 21:36:12 +00:00
ltcptgeneral
33eea153c1 Merge pull request #8 from titanscout2022/containerization-testing
Containerization testing
2020-04-05 16:32:40 -05:00
art
114eee5d57 finalized changes to docker implements 2020-04-05 21:29:16 +00:00
ltcptgeneral
06f008746a Merge pull request #7 from titanscout2022/master
merge
2020-04-05 14:57:56 -05:00
art
4f9c4e0dbb verified and tested docker files 2020-04-05 19:53:01 +00:00
art
5697e8b79e created dockerfiles 2020-04-05 19:04:07 +00:00
ltcptgeneral
e054e66743 started on dockerfile 2020-04-05 12:46:21 -05:00
ltcptgeneral
c914bd3754 removed unessasary comment 2020-04-04 11:59:19 -05:00
ltcptgeneral
6c08885a53 created two new analysis variants
the existing amd64
new unpopulated arm64
2020-04-04 00:09:40 -05:00
ltcptgeneral
375befd0c4 analysis pkg v 1.0.0.9 2020-03-17 20:03:49 -05:00
ltcptgeneral
893d1fb1d0 analysis.py v 1.1.13.007 2020-03-16 22:05:52 -05:00
ltcptgeneral
6a426ae4cd a 2020-03-10 00:45:42 -05:00
ltcptgeneral
50c064ffa4 worked 2020-03-09 22:58:51 -05:00
ltcptgeneral
1b0a9967c8 test1 2020-03-09 22:58:11 -05:00
ltcptgeneral
2605f7c29f Merge pull request #6 from titanscout2022/testing
Testing
2020-03-09 20:42:30 -05:00
ltcptgeneral
6f5a3edd88 superscript.py v 0.0.5.000 2020-03-09 20:35:11 -05:00
ltcptgeneral
457146b0e4 working 2020-03-09 20:29:44 -05:00
ltcptgeneral
f7fd8ffcf9 working 2020-03-09 20:18:30 -05:00
art
77bc792426 removed unessasary stuff 2020-03-09 10:29:59 -05:00
ltcptgeneral
39146cc555 Merge pull request #5 from titanscout2022/comp-edits
Comp edits
2020-03-09 10:28:48 -05:00
ltcptgeneral
04141bbec8 analysis.py v 1.1.13.006
regression.py v 1.0.0.003
analysis pkg v 1.0.0.8
2020-03-08 16:48:19 -05:00
ltcptgeneral
40e5899972 added get_team_rakings.py 2020-03-08 14:26:21 -05:00
ltcptgeneral
025c7f9b3c a 2020-03-06 21:39:46 -06:00
Dev Singh
2daa09c040 hi 2020-03-06 21:21:37 -06:00
ltcptgeneral
9776136649 superscript.py v 0.0.4.002 2020-03-06 21:09:16 -06:00
Dev Singh
68d27a6302 add reqs 2020-03-06 20:44:40 -06:00
Dev Singh
7fc18b7c35 add Procfile 2020-03-06 20:41:53 -06:00
ltcptgeneral
9b412b51a8 analysis pkg v 1.0.0.7 2020-03-06 20:32:41 -06:00
ltcptgeneral
b6ac05a66e Merge pull request #4 from titanscout2022/comp-edits
Comp edits merge
2020-03-06 20:29:50 -06:00
Dev Singh
435c8a7bc6 tiny brain fix 2020-03-06 14:52:41 -06:00
Dev Singh
a69b18354b ultimate carl the fat kid brain working 2020-03-06 14:50:54 -06:00
Dev Singh
7b9e6921d0 ultra galaxybrain working 2020-03-06 14:44:13 -06:00
Dev Singh
fb2800cf9e fix 2020-03-06 13:12:01 -06:00
Dev Singh
12cbb21077 super ultra working 2020-03-06 12:43:01 -06:00
Dev Singh
46d1a48999 even more working 2020-03-06 12:21:17 -06:00
Dev Singh
ad0a761d53 more working 2020-03-06 12:18:42 -06:00
Dev Singh
43f503a38d working 2020-03-06 12:15:35 -06:00
Dev Singh
d38744438b working 2020-03-06 11:50:07 -06:00
Dev Singh
eb8914aa26 maybe working 2020-03-06 11:27:32 -06:00
Dev Singh
283140094f a 2020-03-06 11:18:02 -06:00
Dev Singh
66ac1c304e testing part 2 better electric boogaloo 2020-03-06 11:16:24 -06:00
Dev Singh
0eb9e07711 testing 2020-03-06 11:14:10 -06:00
Dev Singh
f56c85b298 10:57 2020-03-06 10:57:39 -06:00
Dev Singh
6a9a17c5b4 10:43 2020-03-06 10:43:45 -06:00
Dev Singh
e24c49bedb 10:25 2020-03-06 10:25:20 -06:00
Dev Singh
2daed73aaa 10:21 unverified 2020-03-06 10:21:23 -06:00
art
8ebdb3b89b superscript.py v 0.0.3.000 2020-03-05 22:52:02 -06:00
art
a0e1293361 analysis.py v 1.1.13.001
analysis pkg v 1.0.0.006
2020-03-05 13:18:33 -06:00
art
b669e55283 analysis pkg v 1.0.0.005 2020-03-05 12:44:09 -06:00
art
3e38446eae analysis pkg v 1.0.0.004 2020-03-05 12:29:58 -06:00
art
dac0a4a0cd analysis.py v 1.1.13.000 2020-03-05 12:28:16 -06:00
art
897ba03078 removed unessasaary folders and files 2020-03-05 11:17:49 -06:00
ltcptgeneral
e815a2fbf7 superscript.py v 0.0.2.001 2020-03-04 23:59:50 -06:00
ltcptgeneral
941383de4b analysis.py v 1.1.12.006
analysis pkg v 1.0.0.003
2020-03-04 21:20:00 -06:00
ltcptgeneral
5771c7957e a 2020-03-04 20:15:03 -06:00
ltcptgeneral
72c233649d superscript.py v 0.0.1.004 2020-03-04 20:12:09 -06:00
ltcptgeneral
c7031361b0 analysis.py v 1.1.12.005
analysis pkg v 1.0.0.002
2020-03-04 18:55:45 -06:00
ltcptgeneral
59508574c9 analysis pkg 1.0.0.001 2020-03-04 17:54:30 -06:00
ltcptgeneral
d57d1ebc6d analysis.py v 1.1.12.004 2020-03-04 17:52:07 -06:00
ltcptgeneral
70b2ff1151 superscript.py v 0.0.1.003 2020-03-04 16:53:25 -06:00
ltcptgeneral
c3746539b3 superscript.py v 0.0.1.002 2020-03-04 15:57:20 -06:00
ltcptgeneral
405ab3ac74 a 2020-03-04 13:47:56 -06:00
ltcptgeneral
94dd51566a refactors 2020-03-04 13:42:54 -06:00
ltcptgeneral
b5718a500a a 2020-03-04 12:58:57 -06:00
ltcptgeneral
2eaa390f2f d 2020-03-04 12:37:58 -06:00
art
9c666b95be moved analysis-master out of data analysis 2020-03-03 22:38:34 -06:00
art
dfc01a13bd c 2020-03-03 21:04:47 -06:00
art
d4328e6027 changed setup.py back 2020-03-03 21:04:17 -06:00
art
f9a3150438 b 2020-03-03 21:00:26 -06:00
art
6decf183dd a 2020-03-03 20:59:52 -06:00
art
67f940eadb made license explit in setup.py 2020-03-03 20:55:46 -06:00
art
56d0578d86 recompiled analysis.py 2020-03-03 20:48:50 -06:00
art
5e9e90507b packagefied analysis (finally) 2020-03-03 20:30:54 -06:00
art
8b4c50827c added setup.py 2020-03-03 20:24:49 -06:00
art
f8cdd73655 created __init__.py for analysis package 2020-03-03 20:17:05 -06:00
art
74dc02ca99 superscript.py v 0.0.1.001 2020-03-03 20:10:29 -06:00
art
5915827d15 superscript.py v 0.0.1.000 2020-03-03 19:39:58 -06:00
art
f9b0343aa1 moved app in dep 2020-03-03 18:48:17 -06:00
art
938caa75d1 superscript.py v 0.0.0.009
changes to config.csv
2020-03-03 18:40:35 -06:00
art
df66d28959 changes, testing 2020-03-03 18:13:03 -06:00
art
2710642f15 superscript.pv v 0.0.0.008
data.py created
2020-03-03 18:02:24 -06:00
art
51b3dd91b5 removed \n s 2020-03-03 16:27:30 -06:00
art
d00cf142c0 superscript.py v 0.0.0.007 2020-03-03 16:01:07 -06:00
art
ae8706ac08 superscript.py v 0.0.0.006 2020-03-03 15:42:37 -06:00
ltcptgeneral
5305e4a30f Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2020-02-29 01:06:06 -06:00
ltcptgeneral
908a1cd368 a 2020-02-29 01:05:58 -06:00
art
19e0044e0e a 2020-02-26 08:58:27 -06:00
Dev Singh
7ad43e970f Create LICENSE 2020-02-23 13:19:40 -06:00
Dev Singh
fbb3fde754 why are we unlicense? 2020-02-23 13:18:37 -06:00
art
81c81bed94 a 2020-02-20 19:29:21 -06:00
art
f3fc4cefd0 something changed 2020-02-20 19:27:09 -06:00
art
376ea248a4 a 2020-02-20 19:22:06 -06:00
art
9824f9349d fixed jacob 2020-02-20 19:19:20 -06:00
art
eb90582db8 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2020-02-20 19:12:48 -06:00
art
bad9e497b1 a 2020-02-20 19:12:33 -06:00
jlevine18
c3b993cfce tba_match_result_request.py v0.0.1 2020-02-19 21:50:56 -06:00
art
2cb5c54d8b dep 2020-02-19 19:54:59 -06:00
art
7f705915f0 fixes 2020-02-19 19:53:23 -06:00
art
2a8a21b82a something 2020-02-19 19:52:31 -06:00
art
2e09cba94e superscript v 0.0.0.005 2020-02-19 19:51:45 -06:00
art
de9d151ad6 superscript.py v 0.0.0.004 2020-02-19 19:21:48 -06:00
art
452b55ac6f fix 2020-02-18 20:38:12 -06:00
art
fe31db07f9 analysis.py v 1.1.12.003 2020-02-18 20:32:35 -06:00
art
52d79ea25e analysis.py v 1.1.12.002, superscript.py
v 0.0.0.003
2020-02-18 20:29:22 -06:00
art
20833b29c1 superscript.py v 0.0.0.002 2020-02-18 19:54:09 -06:00
art
978a9a9a25 doc 2020-02-18 16:16:57 -06:00
art
9da4322aa9 analysis.py v 1.1.12.000 2020-02-18 15:25:23 -06:00
art
5bdd77ddc6 superscript v 0.0.0.001 2020-02-18 11:31:20 -06:00
art
2782dc006c fix 2020-01-17 10:21:15 -06:00
art
de6c582b8f analysis.py v 1.1.11.010 2020-01-17 10:18:28 -06:00
art
32bc329e91 something changed idk what 2020-01-08 15:01:33 -06:00
art
4e50a79614 analysis.py v 1.1.11.009 2020-01-07 15:55:49 -06:00
ltcptgeneral
190fbf6cac analysis?py v 1.1.11.008 2020-01-06 23:48:28 -06:00
art
a8bf4e46e9 created superscript.py 2020-01-06 14:55:36 -06:00
ltcptgeneral
478c793917 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2020-01-05 19:08:06 -06:00
ltcptgeneral
4b44c7978a whatever 2020-01-05 19:06:54 -06:00
art
0fbb958dd9 regression v 1.0.0.003 2020-01-04 10:19:31 -06:00
art
031e45ac19 analysis.py v 1.1.11.007 2020-01-04 10:13:25 -06:00
art
96bf376b70 analysis.py v 1.1.11.006 2020-01-04 10:04:20 -06:00
art
eca8d4efc1 quick fix 2020-01-04 09:57:06 -06:00
art
d5a7f52b83 spelling 2019-12-23 12:49:38 -06:00
art
ae4ecbd67c analysis.py v 1.1.11.005 2019-12-23 12:48:13 -06:00
ltcptgeneral
0ba3a56ea7 analysis.py v 1.1.11.004 2019-11-16 16:21:06 -06:00
art
1717cc17a1 analysis.py 1.1.11.003 2019-11-11 10:04:12 -06:00
ltcptgeneral
947f7422dc spelling fix 2019-11-10 13:59:59 -06:00
ltcptgeneral
cf14005b67 analysis?py v 1.1.11.002 2019-11-10 02:04:48 -06:00
ltcptgeneral
08ff6aec8e analysis.py v 1.1.11.001 2019-11-10 01:38:39 -06:00
art
234f54ef5d analysis.py v 1.1.11.000 2019-11-08 13:20:38 -06:00
art
df42ae734e analysis.py v 1.1.10.00 2019-11-08 12:41:37 -06:00
art
4979c4b414 analysis.py v 1.1.9.002 2019-11-08 12:26:42 -06:00
art
d6cc419c40 test 2019-11-08 09:50:54 -06:00
ltcptgeneral
a73ce4080c quick fix 2019-11-06 15:33:56 -06:00
ltcptgeneral
456836bdb8 analysis.py 1.1.9.001 2019-11-06 15:32:21 -06:00
ltcptgeneral
a51f1f134d analysis.py v 1.1.9.000 2019-11-06 15:26:13 -06:00
art
db9ce0c25a quick fix 2019-11-05 16:25:53 -06:00
art
92c8b9c8c3 fixed indentation 2019-11-05 13:45:35 -06:00
art
06b0acb9f8 analysis.py v 1.1.8.000 2019-11-05 13:38:49 -06:00
art
7c957d9ddc analysis.py v 1.1.7.000 2019-11-05 13:14:08 -06:00
art
efab5bfde8 analysis.py v 1.1.6.002 2019-11-05 12:56:53 -06:00
art
c886ca8e3f analysis.py v 1.1.6.001 2019-11-05 12:53:39 -06:00
art
2cf7d73c9c analysis.py v 1.1.6.000 2019-11-05 12:47:04 -06:00
art
f12cbcc847 f 2019-11-04 10:14:28 -06:00
art
df6c184b84 quick fix 2019-11-04 10:10:29 -06:00
art
1ea7306eeb __all__ fixes 2019-11-04 10:08:28 -06:00
art
bb41c26531 something changed, idk 2019-11-01 13:12:01 -05:00
art
1d4b2bd49d visualization v 1.0.0.000, titanlearn v 2.0.1.001 2019-11-01 13:08:32 -05:00
art
8dd2440f08 analysis.py v 1.1.5.001 2019-10-31 11:03:52 -05:00
art
ab9b38da95 titanlearn v 2.0.1.000 2019-10-29 14:21:53 -05:00
art
dacf12f8a4 quick fix 2019-10-29 12:27:16 -05:00
art
3894eb481c fixes 2019-10-29 12:25:18 -05:00
art
0198d6896b restructured file management part 3 2019-10-29 10:53:11 -05:00
art
6902521d6b restructured file management part 2 2019-10-29 10:50:10 -05:00
art
590e8424e7 restructured file management 2019-10-29 10:37:23 -05:00
art
bc6916ab15 quick fix 2019-10-29 10:07:56 -05:00
art
2590a40827 depreciated files, titanlearn v 2.0.0.001 2019-10-29 10:04:56 -05:00
art
68006de8c0 titanlearn.py v 2.0.0.000 2019-10-29 09:41:49 -05:00
art
9f0d366408 depreciated 2019 superscripts and company 2019-10-29 09:23:00 -05:00
art
2bdb15a2b3 analysis.py v 1.1.5.001 2019-10-25 09:50:02 -05:00
art
56b575a753 analysis.py v 1,1,5,001 2019-10-25 09:19:18 -05:00
ltcptgeneral
ff2f0787ae analysis.py v 1.1.5.000 2019-10-09 23:58:08 -05:00
jlevine18
7c121d48fc fix PolyRegKernel 2019-10-09 22:23:56 -05:00
art
8eac3d5af1 ok fixed half of it 2019-10-08 13:49:19 -05:00
art
f47be637a0 jacob fix poly regression! 2019-10-08 13:35:32 -05:00
art
c824087335 removed extra import 2019-10-08 12:58:04 -05:00
art
a92dacc7ff added import math 2019-10-08 09:30:07 -05:00
art
37c3430433 removed regression import 2019-10-07 12:58:57 -05:00
ltcptgeneral
3bcf832db0 fix 2019-10-06 19:12:58 -05:00
art
591ddbde9d refactor 2019-10-05 16:53:03 -05:00
art
eaa0bcd5d8 quick fixes 2019-10-05 16:51:11 -05:00
art
45abb9e24d analysis.py v 1.1.4.000 2019-10-05 16:18:49 -05:00
art
a853e9b02b quick change 2019-10-04 10:37:29 -05:00
art
af20fb0fa7 comments 2019-10-04 10:36:44 -05:00
art
3a17ac5154 analysis.py v 1.1.3.002 2019-10-04 10:34:31 -05:00
art
1cdeab4b6b quick fix 2019-10-04 09:28:25 -05:00
art
b2ce781961 quick refactor of glicko2() 2019-10-04 09:12:12 -05:00
art
400b5bb81e upload trueskill for testing purposes 2019-10-04 09:02:46 -05:00
art
fd7ab3a598 analysis.py v 1.1.3.001 2019-10-04 08:13:28 -05:00
ltcptgeneral
9175c2921a analysis.py v 1.1.3.000 2019-10-04 00:26:21 -05:00
ltcptgeneral
1d3de02763 Merge pull request #3 from titanscout2022/elo
Elo
2019-10-03 11:22:57 -05:00
art
b6299ce397 analysis.py v 1.1.2.003 2019-10-03 10:48:56 -05:00
art
8801a300c4 analysis.py v 1.1.2.002 2019-10-03 10:42:05 -05:00
art
acdcb42e6d quick tests 2019-10-02 20:57:09 -05:00
art
484adfcda8 stuff 2019-10-02 20:56:06 -05:00
art
4d01067a57 analysis.py v 1.1.2.001 2019-10-01 08:59:04 -05:00
ltcptgeneral
0991757ddb reduced random blank lines 2019-09-30 16:09:31 -05:00
ltcptgeneral
de0cb1a4e3 analysis.py v 1.1.2.000, quick fixes 2019-09-30 16:02:32 -05:00
ltcptgeneral
bca13420b2 fixes 2019-09-30 15:49:15 -05:00
ltcptgeneral
236ca3bcfd quick fix 2019-09-30 13:41:15 -05:00
ltcptgeneral
b2aa6357d8 analysis.py v 1.1.1.001 2019-09-30 13:37:19 -05:00
ltcptgeneral
941dd4838a analysis.py v 1.1.1.000 2019-09-30 10:11:53 -05:00
ltcptgeneral
91d727b6ad jacob forgot self.scal_mult 2019-09-27 10:13:17 -05:00
ltcptgeneral
2c00f5b26e Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-09-27 09:49:40 -05:00
jlevine18
4f981df7bb Add files via upload 2019-09-27 09:48:05 -05:00
ltcptgeneral
c24e51e2b6 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-09-27 09:41:07 -05:00
ltcptgeneral
f565744867 added testing files to gitignore 2019-09-27 09:40:50 -05:00
ltcptgeneral
d3ee8621f0 spelling fix 2019-09-26 19:22:44 -05:00
jlevine18
e38c12f765 cudaregress v 1.0.0.002 2019-09-26 13:35:37 -05:00
jlevine18
d71b45a8e9 wait arthur moved this 2019-09-26 13:34:42 -05:00
jlevine18
6f9527c726 cudaregress 1.0.0.002 2019-09-26 13:31:22 -05:00
ltcptgeneral
9a99b8de2a quick fix 2019-09-25 14:14:17 -05:00
ltcptgeneral
c32b0150bd analysis.py v 1.1.0.007 2019-09-25 14:11:20 -05:00
ltcptgeneral
86327e97f9 moved and renamed cudaregress.py to regression.py 2019-09-23 09:58:08 -05:00
jlevine18
4fd18ec7fe global vars to bugfix 2019-09-23 09:28:35 -05:00
jlevine18
dc6f896071 Set device bc I apparently forgot to do that 2019-09-23 00:01:31 -05:00
jlevine18
c5d087dada don't need the testing notebook up here anymore 2019-09-22 23:23:29 -05:00
jlevine18
bda2db7003 Add files via upload 2019-09-22 23:22:21 -05:00
jlevine18
53d4a0ecde added cudaregress.py package 2019-09-22 23:19:46 -05:00
ltcptgeneral
db19127d28 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-09-22 23:10:24 -05:00
jlevine18
3ec7e5fed5 added cuda to cudaregress notebook 2019-09-22 23:05:49 -05:00
ltcptgeneral
8bd07cbd32 quick fix 2019-09-22 21:54:28 -05:00
jlevine18
f5b9a678fc fix cuda regress testing notebook 2019-09-22 21:38:12 -05:00
jlevine18
1c8f8fdfe7 added cudaRegress testing notebook 2019-09-21 13:35:51 -05:00
ltcptgeneral
f63c473166 analysis.py v 1.1.0.006 2019-09-17 12:21:44 -05:00
ltcptgeneral
936354a1a2 analysis.py v 1.1.0.005 2019-09-17 08:46:47 -05:00
ltcptgeneral
43d059b477 analysis.py v 1.1.0.004 2019-09-16 11:11:27 -05:00
ltcptgeneral
173f9b3460 benchmarked 2019-09-13 15:09:33 -05:00
ltcptgeneral
eb51d876a5 analysis.py v 1.1.0.003 2019-09-13 14:38:24 -05:00
ltcptgeneral
bee1edbf25 quick fixes 2019-09-13 14:29:22 -05:00
ltcptgeneral
13c17b092a analysis.py v 1.1.0.002 2019-09-13 13:59:13 -05:00
ltcptgeneral
800601121e moved files to subfolder dep 2019-09-13 13:50:12 -05:00
ltcptgeneral
79e77af304 analysis.py v 1.1.0.001 2019-09-13 12:33:02 -05:00
ltcptgeneral
4d6273fa05 analysis.py v 1.1.0.000 2019-09-13 11:14:13 -05:00
ltcptgeneral
c9567f0d7c Rename analysis-better.py to analysis.py 2019-09-12 11:05:33 -05:00
ltcptgeneral
37d3c2b1d2 Rename analysis.py to analysis-dep.py 2019-09-12 11:04:54 -05:00
ltcptgeneral
b689dada3d analysis-better.py v 1.0.9.000
changelog:
    - refactored
    - numpyed everything
    - removed stats in favor of numpy functions
2019-04-09 09:43:42 -05:00
ltcptgeneral
e914d32b37 Create analysis-better.py 2019-04-09 09:30:37 -05:00
ltcptgeneral
5dc3fa344c Delete temp.txt 2019-04-08 09:38:27 -05:00
ltcptgeneral
c7859bf681 Update .gitignore 2019-04-08 09:34:49 -05:00
ltcptgeneral
620b6de028 quick fixes 2019-04-08 09:26:32 -05:00
ltcptgeneral
c1635f79fe Merge branch 'c' 2019-04-08 09:17:26 -05:00
ltcptgeneral
a9d3ef2b51 Create analysis.cp37-win_amd64.pyd 2019-04-08 09:17:16 -05:00
ltcptgeneral
aa107249fd cython working 2019-04-08 09:16:26 -05:00
ltcptgeneral
0c47283dd5 analysis in c working 2019-04-05 21:01:17 -05:00
ltcptgeneral
f49bb58215 started c-ifying analysis 2019-04-05 17:24:24 -05:00
ltcptgeneral
b91ad29ae4 Delete uuh.png 2019-04-03 14:43:59 -05:00
ltcptgeneral
8a869e037b fixed superscript 2019-04-03 14:39:22 -05:00
ltcptgeneral
20f082b760 beautified 2019-04-03 13:34:31 -05:00
ltcptgeneral
ef81273d4a Delete keytemp.json 2019-04-02 14:07:24 -05:00
ltcptgeneral
3761274ee3 Update .gitignore 2019-04-02 13:43:08 -05:00
ltcptgeneral
506c779d82 Merge branch 'multithread' 2019-04-02 13:40:02 -05:00
ltcptgeneral
892b57a1eb whtever 2019-04-01 13:22:37 -05:00
jlevine18
94cc4adbf9 teams for wisconsin regional 2019-03-28 07:54:08 -05:00
ltcptgeneral
2e189bcfa2 teams added 2019-03-27 23:40:05 -05:00
ltcptgeneral
a21d0b5ec6 Update tbarequest.cpython-37.pyc 2019-03-22 19:40:17 -05:00
ltcptgeneral
ebb5f3b09e Update scores.csv 2019-03-22 19:11:11 -05:00
ltcptgeneral
5c4bed42d6 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-03-22 15:22:05 -05:00
ltcptgeneral
c15d037109 something changed 2019-03-22 15:21:58 -05:00
jlevine18
56f704c464 Update tbarequest.py 2019-03-22 15:09:52 -05:00
jlevine18
14a0414265 add req_team_info 2019-03-22 14:54:55 -05:00
Jacob Levine
6dbdfe00fc fixed textArea bug 2019-03-22 12:39:16 -05:00
ltcptgeneral
00c9df4239 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-03-22 11:54:43 -05:00
ltcptgeneral
9c725887c5 created nishant only script 2019-03-22 11:54:40 -05:00
Jacob Levine
9562cc594f fixed another bug 2019-03-22 11:53:15 -05:00
Jacob Levine
56f6752ff7 fixed textArea bug 2019-03-22 11:50:03 -05:00
Jacob Levine
31c8c9ee86 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-22 11:28:45 -05:00
Jacob Levine
0f671daf30 added fields that Arthut needed 2019-03-22 11:28:22 -05:00
Archan Das
e1027a9562 Update teams.csv 2019-03-22 11:12:51 -05:00
Jacob Levine
628dac5835 web3\ 2019-03-22 09:18:59 -05:00
Jacob Levine
fed48bc999 web3\ 2019-03-22 09:16:17 -05:00
Jacob Levine
7be5e15e9a web3 2019-03-22 09:14:41 -05:00
Jacob Levine
365b9e1882 web4 2019-03-22 09:13:44 -05:00
Jacob Levine
e5bb5b6ef7 web3 2019-03-22 09:12:29 -05:00
Jacob Levine
f9e4a6c53d web3 2019-03-22 08:51:42 -05:00
Jacob Levine
e0099aab60 web2 2019-03-22 08:50:37 -05:00
Jacob Levine
925087886c web 2019-03-22 08:49:50 -05:00
Jacob Levine
35f8cd693e archan needs to import! 2019-03-22 08:46:04 -05:00
Jacob Levine
a795a89c2d final fixes 2019-03-22 08:44:42 -05:00
Jacob Levine
92602b3122 change letter 2019-03-22 08:39:01 -05:00
Jacob Levine
aa86a2af7b final fixes 2019-03-22 08:36:33 -05:00
Jacob Levine
6f1cf1828a update archan's script 2019-03-22 08:27:53 -05:00
Jacob Levine
bb5c38fbfe don't sort matches alphabetically, sort them numerically 2019-03-22 07:48:44 -05:00
Jacob Levine
169c1737b2 testing mistakes 2019-03-22 07:37:05 -05:00
Jacob Levine
8244efa09b ok seriously what is going on? 2019-03-22 07:34:00 -05:00
Jacob Levine
4c1abeb200 testing mistakes 2019-03-22 07:32:05 -05:00
Jacob Levine
b41683eaa9 testing mistakes 2019-03-22 07:29:47 -05:00
Jacob Levine
1f29718795 testing mistakes 2019-03-22 07:28:55 -05:00
Jacob Levine
5716a7957e ok 2019-03-22 07:28:11 -05:00
Jacob Levine
7c21c277dd testing mistakes 2019-03-22 07:26:08 -05:00
Jacob Levine
91ddbb5531 wtf 2019-03-22 07:24:27 -05:00
Jacob Levine
21be310e1f testing mistakes 2019-03-22 07:24:15 -05:00
Jacob Levine
0a687648e0 Revert "ok seriously what is going on?"
This reverts commit 8de7078240.
2019-03-22 07:17:26 -05:00
Jacob Levine
80c6b9ba67 Revert "testing mistakes"
This reverts commit 1f20ad7f37.
2019-03-22 07:16:36 -05:00
Jacob Levine
1f20ad7f37 testing mistakes 2019-03-22 07:15:44 -05:00
Jacob Levine
8de7078240 ok seriously what is going on? 2019-03-22 07:05:45 -05:00
Jacob Levine
b88a7f7aa8 wtf 2019-03-22 07:03:31 -05:00
Jacob Levine
313d627fa8 testing mistakes 2019-03-22 07:01:19 -05:00
Jacob Levine
cbe1d9a015 wtf 2019-03-22 06:57:27 -05:00
Jacob Levine
a4288e2a0d move so it doesnt crash 2019-03-22 06:53:28 -05:00
Jacob Levine
4f631a4b79 fix add script for text areas 2019-03-22 06:48:45 -05:00
Jacob Levine
e992483f35 dont be stupid 2019-03-22 01:06:05 -05:00
Jacob Levine
4671eacb6e chrome is still horrible 2019-03-22 01:04:18 -05:00
Jacob Levine
c76a5ddb5e case sensitive 2019-03-22 00:50:20 -05:00
Jacob Levine
1989ec5ad4 chrome is still horrible 2019-03-22 00:49:19 -05:00
Jacob Levine
0124d9db97 chrome is horrible 2019-03-22 00:46:19 -05:00
Jacob Levine
02ce675a0b dont be stupid 2019-03-22 00:40:27 -05:00
Jacob Levine
64da97bfdc dont be stupid 2019-03-22 00:38:37 -05:00
Jacob Levine
23d6eebff1 dont be stupid 2019-03-22 00:36:12 -05:00
Jacob Levine
8dcb59a15c bugfix 23 2019-03-22 00:33:50 -05:00
Jacob Levine
ebe25312b5 af 2019-03-22 00:31:31 -05:00
Jacob Levine
28b5c6868e bugfix 22 2019-03-22 00:30:36 -05:00
Jacob Levine
0c09631813 st 2019-03-22 00:29:34 -05:00
Jacob Levine
23d821b773 bugfix 21 2019-03-22 00:27:44 -05:00
Jacob Levine
cc958e0927 bugfix 20 2019-03-22 00:24:23 -05:00
Jacob Levine
7e23641591 bugfix 19 2019-03-22 00:21:46 -05:00
Jacob Levine
dc8bc17324 bugfix 18 2019-03-22 00:20:01 -05:00
Jacob Levine
23b16d2e92 bugfix 16,17 2019-03-22 00:16:32 -05:00
Jacob Levine
5200dbc4d7 ian stopped naming his questions 2019-03-22 00:09:24 -05:00
Jacob Levine
3fd42c46c9 bugfix 15 2019-03-22 00:08:00 -05:00
Jacob Levine
19015d79e6 bugfix 14 2019-03-22 00:05:35 -05:00
Jacob Levine
3eef220768 ESCAPE STRINGS 2019-03-22 00:03:25 -05:00
Jacob Levine
2f088898f8 minor fixes 2019-03-22 00:01:01 -05:00
Jacob Levine
f091dd9113 bugfix 10 2019-03-21 23:58:46 -05:00
Jacob Levine
2a32386a9e dont be stupid 2019-03-21 23:53:16 -05:00
Jacob Levine
45055b1505 minor fixes 2019-03-21 23:45:25 -05:00
Jacob Levine
afcda88760 dont be stupid 2019-03-21 23:40:46 -05:00
Jacob Levine
3ad0dcd851 fix fix bugfix 6 2019-03-21 23:36:29 -05:00
Jacob Levine
ac32743210 fix bugfix 6 2019-03-21 23:35:14 -05:00
Jacob Levine
978342c480 bugfix 6 2019-03-21 23:33:11 -05:00
Jacob Levine
8e6a927032 bugfix 5 2019-03-21 23:31:50 -05:00
Jacob Levine
0bf52e1c29 bugfix 4 2019-03-21 23:30:38 -05:00
Jacob Levine
06242f0b2a remove random 'm' 2019-03-21 23:29:01 -05:00
Jacob Levine
4398de71ba website for peoria 2019-03-21 23:28:18 -05:00
Jacob Levine
15f504ecc3 typo! 2019-03-21 23:26:45 -05:00
Jacob Levine
06be451456 website for peoria 2019-03-21 23:25:24 -05:00
Jacob Levine
e498f4275e readded css 2019-03-21 23:17:24 -05:00
Jacob Levine
1633ef7862 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-21 23:13:12 -05:00
Jacob Levine
2cff74aa54 website for peoria 2019-03-21 23:12:49 -05:00
Archan Das
040b4dc52a Add files via upload 2019-03-21 22:52:08 -05:00
Archan Das
35d8e5ff77 Add files via upload 2019-03-21 22:50:27 -05:00
ltcptgeneral
d3b39d8167 Delete test.py 2019-03-21 22:17:33 -05:00
ltcptgeneral
c7b3d7e9a3 superscript v 1.0.6.001
changelog:
- fixed multiple bugs
- works now
2019-03-21 18:02:51 -05:00
ltcptgeneral
10f8839bbd WORKING 2019-03-21 17:52:59 -05:00
ltcptgeneral
1eb568c807 Revert "beautified"
This reverts commit 0d8780b3c1.
2019-03-21 17:50:52 -05:00
ltcptgeneral
12cf4a55d7 Revert "yeeted"
This reverts commit 1f2edeba51.
2019-03-21 17:50:46 -05:00
ltcptgeneral
e81f6052e3 Revert "stuff"
This reverts commit 268b01fc93.
2019-03-21 17:50:37 -05:00
ltcptgeneral
bbebc4350c Revert "no"
This reverts commit ac7c169a27.
2019-03-21 17:50:32 -05:00
ltcptgeneral
ac7c169a27 no 2019-03-21 17:43:36 -05:00
ltcptgeneral
268b01fc93 stuff 2019-03-21 17:34:27 -05:00
ltcptgeneral
9c7647aba9 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-03-21 17:28:23 -05:00
ltcptgeneral
1f2edeba51 yeeted 2019-03-21 17:28:16 -05:00
jlevine18
b3781ada45 Delete Untitled.ipynb 2019-03-21 17:28:04 -05:00
Jacob Levine
0d8780b3c1 beautified 2019-03-21 17:27:31 -05:00
ltcptgeneral
64a89cc58f WORKING!!!! 2019-03-21 17:25:16 -05:00
ltcptgeneral
f092bd3cb1 Update superscript.py 2019-03-21 17:00:38 -05:00
ltcptgeneral
c4309f5679 Update superscript.py 2019-03-21 16:59:29 -05:00
ltcptgeneral
e19bb8dcc1 1 2019-03-21 16:58:37 -05:00
ltcptgeneral
c9436f15f8 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-03-21 16:57:02 -05:00
jlevine18
8c867dcf95 Update superscript.py 2019-03-21 16:55:04 -05:00
ltcptgeneral
d3e98391d4 Create superscript.py 2019-03-21 16:52:37 -05:00
ltcptgeneral
ef336eb454 a 2019-03-21 16:52:22 -05:00
ltcptgeneral
12f5536026 wtf2 2019-03-21 16:50:32 -05:00
Jacob Levine
6a0d8f4144 fixed null removal script 2019-03-21 16:48:02 -05:00
ltcptgeneral
7f80339fb4 working 2019-03-21 16:17:45 -05:00
ltcptgeneral
9ea074c99c WTF 2019-03-21 15:59:47 -05:00
ltcptgeneral
4188b4b1c3 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-03-21 15:25:46 -05:00
ltcptgeneral
41ea4e9ed8 test 2019-03-21 15:25:36 -05:00
jlevine18
9fe9084341 add opr request 2019-03-21 15:23:24 -05:00
ltcptgeneral
9f894428c1 Update test.py 2019-03-21 15:14:24 -05:00
ltcptgeneral
4227106b4f Update superscript.py 2019-03-21 15:07:24 -05:00
ltcptgeneral
82754ede58 wtf 2019-03-21 15:06:54 -05:00
ltcptgeneral
a1d0cd37b7 test 2019-03-21 14:38:53 -05:00
ltcptgeneral
5e13ca3b5e Update test.py 2019-03-20 22:15:31 -05:00
ltcptgeneral
7c96233f5b Update test.py 2019-03-20 21:36:49 -05:00
ltcptgeneral
3ecf08cf9b too much iteration 2019-03-20 20:18:55 -05:00
ltcptgeneral
04c561baea Update issue templates 2019-03-20 18:14:59 -05:00
ltcptgeneral
5eaf733651 Update superscript.py 2019-03-20 18:14:32 -05:00
ltcptgeneral
d0435a5528 Create LICENSE 2019-03-20 17:41:10 -05:00
ltcptgeneral
55cc572d5c Create CONTRIBUTING.md 2019-03-20 17:37:38 -05:00
ltcptgeneral
8577d4dafa Update README.md 2019-03-20 17:33:47 -05:00
ltcptgeneral
08aec2537e fix 0 2019-03-20 17:23:41 -05:00
ltcptgeneral
975db73aae key fix? 2019-03-20 16:53:53 -05:00
ltcptgeneral
6cb09240ab Update superscript.py 2019-03-20 16:38:42 -05:00
ltcptgeneral
c74b0f34a6 superscript.py - v 1.0.6.000
changelog:
- added pulldata function
- service now pulls in, computes data, and outputs data as planned
2019-03-20 16:16:48 -05:00
ltcptgeneral
3e47a232cc 1234567890 2019-03-20 14:10:47 -05:00
Jacob Levine
2e356405e1 bugfix 16 2019-03-18 21:06:13 -05:00
Jacob Levine
f59d94282d bugfix 15 2019-03-18 21:02:23 -05:00
Jacob Levine
b0ad3bdf9c bugfix 14 2019-03-18 20:47:16 -05:00
Jacob Levine
733c7cbfe7 bugfix 13 2019-03-18 19:20:27 -05:00
Jacob Levine
a95684213c bugfix 12 2019-03-18 19:16:17 -05:00
Jacob Levine
76b4107999 bugfix 11 2019-03-18 19:15:17 -05:00
Jacob Levine
b2bb2df3f0 bugfix 10, now with template literals 2019-03-18 19:10:18 -05:00
Jacob Levine
3ec4de4fb1 bugfix 9 2019-03-18 18:57:43 -05:00
Jacob Levine
bf1572765c bugfix 8 2019-03-18 18:53:41 -05:00
Jacob Levine
0717ed4979 bugfix 7 2019-03-18 18:41:20 -05:00
Jacob Levine
ab421e4170 bugfix 4 2019-03-18 18:38:45 -05:00
Jacob Levine
c5f6ecae68 bugfix 5 2019-03-18 18:35:59 -05:00
Jacob Levine
7dbffc940a bugfix 4 2019-03-18 18:28:47 -05:00
Jacob Levine
8f4e6e3510 bugfix 3 2019-03-18 18:27:46 -05:00
Jacob Levine
86325e7d2b bugfixes 2 2019-03-18 18:06:11 -05:00
Jacob Levine
cf6c6180d3 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-18 17:55:56 -05:00
Jacob Levine
da315ac908 bugfix 1 2019-03-18 17:54:34 -05:00
jlevine18
3b95963eb1 Merge pull request #1 from titanscout2022/signUps
Sign ups demo
2019-03-18 17:21:40 -05:00
Jacob Levine
1fdd80e31b multiform demo mk 1 2019-03-18 17:13:45 -05:00
Jacob Levine
926db38db9 continue with multi-form 2019-03-17 23:27:46 -05:00
Jacob Levine
f483cbbcfb Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-16 15:49:49 -05:00
Jacob Levine
0d111296af changed to signups. not complete yet 2019-03-16 15:47:56 -05:00
ltcptgeneral
9fb53f4297 Update titanlearn.py 2019-03-16 13:12:59 -05:00
ltcptgeneral
69ef08bfd4 1234567890 2019-03-10 11:42:43 -05:00
ltcptgeneral
0159f116c1 12345678 2019-03-09 16:27:36 -06:00
Jacob Levine
da6f2ce044 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-09 14:08:38 -06:00
Jacob Levine
053001186e added frc elo notebook 2019-03-09 14:05:47 -06:00
jlevine18
177e8ad783 Delete pullmatches.py 2019-03-08 22:19:11 -06:00
Jacob Levine
047f682030 added scoreboard 2019-03-08 22:05:35 -06:00
Jacob Levine
041db246b1 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-08 21:56:15 -06:00
Jacob Levine
54888a3988 added day 1 processing 2019-03-08 21:55:52 -06:00
ltcptgeneral
c726551ec7 Update superscript.py 2019-03-08 19:00:02 -06:00
ltcptgeneral
a36ba0413a superscript v 1.0.5.003
changelog:
- hotfix: actually pushes data correctly now
2019-03-08 17:43:38 -06:00
Jacob Levine
79d0bda1ef fix defaults 2019-03-08 12:54:41 -06:00
Jacob Levine
a7def3c367 reworked questions to comply with Ian's app 2019-03-08 12:48:10 -06:00
Jacob Levine
1ee9867ea6 fix typo 2019-03-08 10:54:14 -06:00
Jacob Levine
44f209f331 added strat options 2019-03-08 10:47:49 -06:00
Jacob Levine
274017806f sets timeout for reload 2019-03-07 23:37:54 -06:00
Jacob Levine
90adb6539a final fix for the night! 2019-03-07 23:33:58 -06:00
Jacob Levine
be4ec9ea51 bugfix 2019-03-07 23:30:33 -06:00
Jacob Levine
b89fab51c3 fix typo 2019-03-07 23:29:16 -06:00
Jacob Levine
6247c7997f added full functionality to scout 2019-03-07 23:26:30 -06:00
Jacob Levine
9baa4450b0 stylinh 2019-03-07 21:25:32 -06:00
Jacob Levine
2a449eba1a one of these times im going to actually catch it 2019-03-07 21:22:04 -06:00
Jacob Levine
dfd5366112 fix typo 2019-03-07 21:21:12 -06:00
Jacob Levine
dc180862df fix typo 2019-03-07 21:20:07 -06:00
Jacob Levine
9d9dcbbb71 fix typo 2019-03-07 21:18:13 -06:00
Jacob Levine
ed151f1707 sections 2019-03-07 21:16:54 -06:00
Jacob Levine
302f6b794d bugfix 2019-03-07 20:55:49 -06:00
Jacob Levine
1925943660 start scout 2019-03-07 20:54:55 -06:00
Jacob Levine
0e358a9a14 final fixes (hopefully this time) 2019-03-07 20:21:05 -06:00
Jacob Levine
2c9e553b57 fix typo 2019-03-07 20:19:58 -06:00
Jacob Levine
ee4ee316dd final page fix 2019-03-07 20:18:54 -06:00
Jacob Levine
12e39ecc84 fix typo 2019-03-07 20:17:10 -06:00
Jacob Levine
eb20ad907e fix mistake 2019-03-07 20:16:14 -06:00
Jacob Levine
61b286c258 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-07 20:14:05 -06:00
Jacob Levine
77231d00cc now you can leave teams 2019-03-07 20:13:32 -06:00
jlevine18
4322396088 arthur don't be stupid 2019-03-07 20:03:20 -06:00
Jacob Levine
c5dc49f442 final profile fix 2019-03-07 19:57:20 -06:00
Jacob Levine
0684f982b7 fix structure 2019-03-07 19:55:30 -06:00
Jacob Levine
b5d8851c44 fix data structure 2019-03-07 19:48:50 -06:00
Jacob Levine
b0782ed74e test bugfix 2019-03-07 19:47:35 -06:00
Jacob Levine
3e76c55801 testing... 2019-03-07 19:46:01 -06:00
Jacob Levine
834068244e test bugfix 2019-03-07 19:43:50 -06:00
Jacob Levine
d833d0a183 fix typo 2019-03-07 19:38:05 -06:00
Jacob Levine
1f50c6dd16 test bugfix 2019-03-07 19:37:06 -06:00
Jacob Levine
9ca336934a Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-03-07 19:32:14 -06:00
Jacob Levine
251390fddf fixed teamlogic 2019-03-07 19:31:34 -06:00
ltcptgeneral
aaa548fb65 hotfix 2000 2019-03-07 09:14:20 -06:00
ltcptgeneral
7710da503b 12 2019-03-06 20:05:50 -06:00
ltcptgeneral
18969b4179 Update superscript.py 2019-03-05 13:36:47 -06:00
ltcptgeneral
ecb6400b06 lotta bug fixes 2019-03-04 16:38:40 -06:00
ltcptgeneral
67393e0e09 1 2019-03-03 22:50:29 -06:00
ltcptgeneral
442d9a9682 Update analysis.py 2019-03-02 20:18:51 -06:00
ltcptgeneral
7434263165 titanscouting app v 1.0.0.003
simple bug fix
2019-03-02 19:58:00 -06:00
ltcptgeneral
d20d0e4e7a titanscouting app v 1.0.0.002 2019-03-02 19:47:31 -06:00
ltcptgeneral
836abc427a ryiop 2019-03-02 16:34:48 -06:00
ltcptgeneral
8cc6b2774e Create README.md 2019-03-02 16:34:12 -06:00
jlevine18
e98e66bdf0 tl.py 2019-03-02 08:18:28 -06:00
ltcptgeneral
791c4e82a5 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-03-01 13:49:36 -06:00
ltcptgeneral
110da31d50 Update titanlearn.py 2019-03-01 13:49:33 -06:00
jlevine18
0e9a706904 Update titanlearn.py 2019-03-01 12:25:41 -06:00
ltcptgeneral
28b5f9d6a2 dumb 2019-03-01 12:18:38 -06:00
ltcptgeneral
00af69a3f5 Update superscript.py 2019-02-28 13:39:35 -06:00
ltcptgeneral
e61403174d sfasf 2019-02-28 13:28:29 -06:00
ltcptgeneral
632a2472a2 bassbsabjasb 2019-02-28 13:13:52 -06:00
ltcptgeneral
d62a07a69e Update superscript.py 2019-02-28 09:04:37 -06:00
ltcptgeneral
85d4a29cf2 Update superscript.py 2019-02-27 14:01:25 -06:00
ltcptgeneral
6678e49cbf superscript.py - v 1.0.5.002
changelog:
- more information given
- performance improvements
2019-02-27 14:00:29 -06:00
ltcptgeneral
839c5d2943 superscript.py - v 1.0.5.001
changelog:
- grammar
2019-02-27 13:43:33 -06:00
ltcptgeneral
79b4cf1158 superscript.py - v 1.0.5.000
changelog:
- service now iterates forever
- ready for production other than pulling json data
2019-02-27 13:38:24 -06:00
ltcptgeneral
9b9d6bcd23 superscript.py - v 1.0.4.001
changelog:
- grammar fixes
2019-02-26 23:18:26 -06:00
ltcptgeneral
2b1dd3ed9b superscript.py - v 1.0.4.000
changelog:
- actually pushes to firebase
2019-02-26 19:39:56 -06:00
ltcptgeneral
7afe68e315 Update .gitignore 2019-02-26 19:10:53 -06:00
ltcptgeneral
0f58ce0fd7 security patch 2019-02-22 12:23:49 -06:00
ltcptgeneral
badcb373ae Update bdata.csv 2019-02-21 12:33:13 -06:00
ltcptgeneral
e5cf8a43d4 superscript.py - v 1.
changelog:
- processes data more efficiently
2019-02-20 22:59:17 -06:00
ltcptgeneral
aba4b44da4 superscript.py - v 1.0.3.000
changelog:
- actually processes data
2019-02-20 11:44:11 -06:00
ltcptgeneral
c4fa9c5f23 qwertyuiop 2019-02-19 13:21:06 -06:00
ltcptgeneral
22688de9e8 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2019-02-19 09:44:55 -06:00
ltcptgeneral
042efb2b5a superscript.py - v 1.0.2.000
changelog:
- added data reading from folder
- nearly crashed computer reading from 20 GiB of data
2019-02-19 09:44:51 -06:00
Jacob Levine
060a77f4b7 fix more typos 2019-02-12 21:00:43 -06:00
Jacob Levine
ffd64eb3d2 fix typos 2019-02-12 21:00:00 -06:00
Jacob Levine
4822be0ece fix typos 2019-02-12 20:55:56 -06:00
Jacob Levine
d3b71287c4 squash bugh 2019-02-12 20:52:03 -06:00
Jacob Levine
67ac98b9ab fix more typos 2019-02-12 20:49:23 -06:00
Jacob Levine
9e0c6e36ee can i set the world record for most typos 2019-02-12 20:48:35 -06:00
Jacob Levine
d0d431fb54 fix even more typos 2019-02-12 20:46:23 -06:00
Jacob Levine
718ca83a1d fix more typos 2019-02-12 20:44:21 -06:00
Jacob Levine
e0c159de00 fix typos 2019-02-12 20:42:49 -06:00
Jacob Levine
6652918ae8 I apparently don't know how to js 2019-02-12 20:41:43 -06:00
Jacob Levine
4f3ecf4361 fix more typos 2019-02-12 20:37:50 -06:00
Jacob Levine
dd5da3b1e8 fix typos 2019-02-12 20:34:05 -06:00
Jacob Levine
45a4387c68 started teams page 2019-02-12 20:20:30 -06:00
Jacob Levine
c6b2840e07 last style fixed before i do something else, for real this time 2019-02-09 15:53:39 -06:00
Jacob Levine
6362f50fd3 last style fixed before i do something else, for real this time 2019-02-09 15:50:34 -06:00
Jacob Levine
d5622c8672 last style fixed before i do something eks 2019-02-09 15:49:21 -06:00
Jacob Levine
3abc50cf7a js dom terms aren't very consistent 2019-02-09 15:44:46 -06:00
Jacob Levine
0f68468f14 fix style inconsistencies 2019-02-09 15:42:16 -06:00
Jacob Levine
6d45200ca3 other style 2019-02-09 15:36:59 -06:00
Jacob Levine
80aee80548 other style 2019-02-09 15:30:27 -06:00
Jacob Levine
3d27f3c127 margins aren't for tables 2019-02-09 15:29:21 -06:00
Jacob Levine
9fd7966c55 other style updates 2019-02-09 15:27:17 -06:00
Jacob Levine
4529ee32e2 no but this ugly html hack should 2019-02-09 15:25:25 -06:00
Jacob Levine
3a5629f0ba does making everything auto fix it? 2019-02-09 15:19:14 -06:00
Jacob Levine
fe74aea4de maybe we can fix it in js 2019-02-09 15:12:17 -06:00
Jacob Levine
76ac58dbab maybe we can fix it in js 2019-02-09 15:10:24 -06:00
Jacob Levine
db0ddec2c6 overflow-x 2019-02-09 14:57:55 -06:00
Jacob Levine
c6980ff71d time to actually start making this look legit 2019-02-09 14:54:03 -06:00
Jacob Levine
a4840003f5 what was i thinking? 2019-02-09 14:46:59 -06:00
Jacob Levine
aad41e57a9 even more styling, if you can call it that 2019-02-09 14:43:14 -06:00
Jacob Levine
24a8500588 more styling, if you can call it that 2019-02-09 14:41:31 -06:00
Jacob Levine
63c69ecc14 styling, if you can call it that 2019-02-09 14:39:32 -06:00
Jacob Levine
1c775fca2c you can now actually see the profile update page 2019-02-09 14:34:01 -06:00
Jacob Levine
1073bc458a typo fix 2019-02-09 14:32:52 -06:00
Jacob Levine
f8dafe61f8 revamped profile page 2019-02-09 14:30:58 -06:00
Jacob Levine
c97e51d9bd even more bugfix 2019-02-09 14:01:32 -06:00
Jacob Levine
2e779a95d2 more bugfix 2019-02-09 14:00:50 -06:00
Jacob Levine
0c609064a6 bugfix 2019-02-09 13:59:23 -06:00
Jacob Levine
059509e018 revamped sign-in, now that we have working checks 2019-02-09 13:57:48 -06:00
Jacob Levine
2c9951d2c9 ok this should fix 2019-02-09 13:33:14 -06:00
Jacob Levine
290110274b even more of a last-ditch effort to make js not multithread everything 2019-02-09 13:32:06 -06:00
Jacob Levine
7d02c6373c even more of a last-ditch effort to make js not multithread everything 2019-02-09 13:04:12 -06:00
Jacob Levine
0b0d36d660 last-ditch effort to make js not multithread everything 2019-02-09 13:01:14 -06:00
Jacob Levine
807c66dd3a ok this should fix 2019-02-09 12:46:20 -06:00
Jacob Levine
f0c0d646b5 ok this should fix 2019-02-09 12:41:52 -06:00
Jacob Levine
390f3d9c4d rephrased check script. are you happy now, JS? 2019-02-09 12:31:25 -06:00
Jacob Levine
19a9995875 i apparently can't type 2019-02-09 12:17:47 -06:00
Jacob Levine
95eab24247 adding standalone profile page 2019-02-09 12:14:55 -06:00
Jacob Levine
3da5a0cbd7 adding timeout 2019-02-09 11:43:47 -06:00
Jacob Levine
447e3e12a3 apperently window loads too fast for firebase 2019-02-09 11:38:57 -06:00
Jacob Levine
5b922fc10b squashing bugs 2019-02-09 11:33:24 -06:00
Jacob Levine
e661af1add Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-02-09 11:29:03 -06:00
Jacob Levine
192d023325 testing signout logic 2019-02-09 11:27:46 -06:00
ltcptgeneral
6b91fe9819 fixed copy paste oppsie 2019-02-08 15:42:33 -06:00
Jacob Levine
82231cb04b styling fixes 2019-02-06 18:20:31 -06:00
Jacob Levine
39dc72add2 onload scripts 2019-02-06 18:19:18 -06:00
Jacob Levine
ac158bf0a9 bugfixes 2019-02-06 18:12:39 -06:00
Jacob Levine
7b2915f4f2 styling fixes 2019-02-06 18:09:47 -06:00
Jacob Levine
64354dbe19 Merge branch 'master' of https://github.com/titanscout2022/tr2022-strategy 2019-02-06 17:52:37 -06:00
Jacob Levine
901c8d25f8 added 3 other pages 2019-02-06 17:51:58 -06:00
ltcptgeneral
b346b01223 android app v 1.0.0.001 2019-02-06 17:43:38 -06:00
ltcptgeneral
73b419dfd6 android app v 1.0.0.000
finished android app
published source code
2019-02-06 17:06:25 -06:00
Jacob Levine
48f34f0472 revert some changes 2019-02-06 16:50:39 -06:00
Jacob Levine
e1769235f3 more styling 2019-02-06 16:45:56 -06:00
Jacob Levine
ac00138ca8 styling 2019-02-06 16:42:15 -06:00
Jacob Levine
28b5801bcc added sidebar 2019-02-06 16:21:41 -06:00
Jacob Levine
f2ed8ab04c sizing 2019-02-06 16:17:07 -06:00
Jacob Levine
781b4dc8b5 bugfix 2019-02-06 16:14:39 -06:00
Jacob Levine
19a236251a added sidebar 2019-02-06 16:08:28 -06:00
Jacob Levine
0d481b01df bugfix 2019-02-06 15:55:22 -06:00
jlevine18
5de2528d34 more bugfix 2019-02-06 15:37:27 -06:00
Jacob Levine
317ca72377 added info change functionality 2019-02-06 15:35:51 -06:00
Jacob Levine
c6e719240a bugfix 2019-02-06 15:25:15 -06:00
Jacob Levine
e554a1df99 reworked fix profile info 2019-02-06 15:22:09 -06:00
Jacob Levine
d9e7a1ed1e testing bugs 2019-02-06 15:04:31 -06:00
Jacob Levine
d968f10737 bugfix 2019-02-06 14:56:17 -06:00
Jacob Levine
dc80127dee bugfix 2019-02-06 14:51:31 -06:00
Jacob Levine
c591c84c75 added info change functionality 2019-02-06 14:46:41 -06:00
Jacob Levine
e290f5ae11 layout changes 2019-02-06 14:15:59 -06:00
Jacob Levine
b8d209b283 new fixes 2019-02-06 13:57:29 -06:00
Jacob Levine
f195b81974 added profile change functionality 2019-02-06 13:24:56 -06:00
ltcptgeneral
1293de346e analysis.py v 1.0.8.005, superscript.py v 1.0.1.000
changelog analysis.py:
- minor fixes
changelog superscript.py:
- added data reading from file
- added superstructure to code
2019-02-05 09:50:10 -06:00
ltcptgeneral
1b41c409cc created superscript.py, tbarequest.py v 1.0.1.000, edited repack_json.py
changelog tbarequest.py:
- fixed a simple error
2019-02-05 09:42:00 -06:00
ltcptgeneral
38d471113f Update .gitignore 2019-02-05 09:02:04 -06:00
ltcptgeneral
b31beb25be oof^2 2019-02-04 12:33:25 -06:00
ltcptgeneral
e3db22d262 Delete temp.txt 2019-02-04 10:50:43 -06:00
ltcptgeneral
e2d2e6687f oof 2019-02-04 10:50:07 -06:00
ltcptgeneral
b64ec05134 removed app bc jacob did fancy shit 2019-01-26 10:45:19 -06:00
ltcptgeneral
511e627899 Update workspace.xml 2019-01-26 10:40:35 -06:00
ltcptgeneral
ab0b2b9992 initialized app project 2019-01-26 10:32:00 -06:00
ltcptgeneral
0021eed5fb analysis.py - v 1.0.8.004
changelog
- removed a few unused dependencies
2019-01-26 10:11:54 -06:00
ltcptgeneral
8c35d8a3f6 yeeted histo_analysis_old() due to depreciation 2019-01-23 09:09:14 -06:00
ltcptgeneral
e5420844de yeeted useless comments 2019-01-22 22:42:37 -06:00
jlevine18
0fca5f58db ApiKey now changed and hidden-don't be stupid jake 2019-01-06 13:41:15 -06:00
Jacob Levine
07880038b0 folder move fix 2019-01-06 13:18:01 -06:00
Jacob Levine
d2d5d4c04e push all website files 2019-01-06 13:14:45 -06:00
jlevine18
d7301e26c3 Add files via upload 2019-01-06 13:02:35 -06:00
jlevine18
752b981e37 Rename website/functions/acorn to website/functions/node_modules/.bin/acorn 2019-01-06 12:57:46 -06:00
jlevine18
5f2db375f3 Add files via upload 2019-01-06 12:56:49 -06:00
jlevine18
cac1b4fba4 Add files via upload 2019-01-06 12:55:50 -06:00
jlevine18
236c4d02b6 Create index.js 2019-01-06 12:55:31 -06:00
jlevine18
8645eace5b Delete style.css 2019-01-06 12:54:41 -06:00
jlevine18
47cce54b3b Delete scripts.js 2019-01-06 12:54:35 -06:00
jlevine18
5a0fe35f86 Delete index.html 2019-01-06 12:54:29 -06:00
jlevine18
d3f8b474d0 upload website 2019-01-06 12:54:08 -06:00
ltcptgeneral
27145495e7 Update analysis.docs 2018-12-30 16:49:44 -06:00
ltcptgeneral
1a8da3fdd5 analysis.py - v 1.0.8.003
changelog:
- added p_value function
2018-12-29 16:28:41 -06:00
ltcptgeneral
444bfb5945 stuff 2018-12-26 17:08:04 -06:00
ltcptgeneral
cfee240e9c pineapple 2018-12-26 12:37:49 -06:00
ltcptgeneral
83a1dd5ced orange 2018-12-26 12:22:31 -06:00
ltcptgeneral
bf75e804cc bannana 2018-12-26 12:22:17 -06:00
ltcptgeneral
83e4f60a37 apple 2018-12-26 12:21:44 -06:00
bearacuda13
ae11605013 Add files via upload 2018-12-26 12:18:40 -06:00
bearacuda13
08b336cf15 Add files via upload 2018-12-26 12:14:05 -06:00
ltcptgeneral
eeeec86be6 temp 2018-12-26 12:06:42 -06:00
ltcptgeneral
9dbd897323 analysis.py - v 1.0.8.002
changelog:
- updated __all__ correctly to contain changes made in v 1.0.8.000 and v 1.0.8.001
2018-12-24 16:44:03 -06:00
jlevine18
71337c0fd5 fix other stupid mistakes 2018-12-24 14:50:04 -06:00
jlevine18
4e015180b6 fix syntax error 2018-12-24 14:42:54 -06:00
jlevine18
70591bc581 started ML module 2018-12-24 09:32:25 -06:00
jlevine18
288f97a3fd visualizer.py is now visualization.py 2018-12-21 11:10:18 -06:00
jlevine18
1126373bf2 Update tbarequest.py 2018-12-21 11:07:21 -06:00
jlevine18
fd0d43d29c added TBA requests module 2018-12-21 11:04:46 -06:00
jlevine18
cc6a7697cf Update visualization.py 2018-12-20 22:01:28 -06:00
jlevine18
2140ea8f77 started visualization module 2018-12-20 21:45:05 -06:00
ltcptgeneral
9dd5cc76f6 analysis.py - v 1.0.8.001
changelog:
- refactors
- bugfixes
2018-12-20 20:49:09 -06:00
ltcptgeneral
7b1e54eed8 refactor analysis.py 2018-12-20 15:05:43 -06:00
ltcptgeneral
188a7bbf1f Update data.csv 2018-12-20 12:21:26 -06:00
ltcptgeneral
b7a0c5286a analysis.py - v 1.0.8.000
changelog:
- depreciated histo_analysis_old
- depreciated debug
- altered basic_analysis to take array data instead of filepath
- refactor
- optimization
2018-12-20 12:21:22 -06:00
ltcptgeneral
32a2d6321c no change 2018-12-13 08:57:19 -06:00
ltcptgeneral
d2f6961693 Update analysis.cpython-37.pyc 2018-12-07 16:56:09 -06:00
ltcptgeneral
107076ac35 added visualizer.py, reorganized folders 2018-12-05 11:31:38 -06:00
ltcptgeneral
0b73460446 Update analysis.cpython-37.pyc 2018-12-04 19:05:13 -06:00
ltcptgeneral
39d5522650 Update analysis_docs.txt 2018-12-01 22:34:30 -06:00
ltcptgeneral
68d6c87589 Update analysis_docs.txt 2018-12-01 22:13:19 -06:00
ltcptgeneral
222c536631 created docs 2018-12-01 21:02:53 -06:00
ltcptgeneral
bd3f695938 a 2018-12-01 14:51:50 -06:00
ltcptgeneral
1b1a7c45bf Update analysis.cpython-37.pyc 2018-12-01 14:51:38 -06:00
ltcptgeneral
8a58fe28fa analysis.py - v 1.0.7.002
changelog:
	- bug fixes
2018-11-29 12:58:53 -06:00
ltcptgeneral
9c67e6f927 analysis.py - v 1.0.7.001
changelog:
	- bug fixes
2018-11-29 12:36:25 -06:00
ltcptgeneral
8d2dedc5a2 update analysis.py 2018-11-29 09:33:18 -06:00
ltcptgeneral
944cb31883 Update analysis.py
a quick update
2018-11-29 09:32:27 -06:00
ltcptgeneral
b38ffe1f08 Update requirements.txt 2018-11-29 09:31:55 -06:00
ltcptgeneral
19f89d3f35 updated stuff 2018-11-29 09:27:08 -06:00
ltcptgeneral
504fc92feb Create analysis.cpython-37.pyc 2018-11-29 09:04:17 -06:00
ltcptgeneral
5eb5e5ed8e removes stuff 2018-11-29 09:00:47 -06:00
ltcptgeneral
88be42de45 removed generate_data.py 2018-11-29 08:53:41 -06:00
ltcptgeneral
704a2d5808 analysis.py - v 1.0.7.000
changelog:
        - added tanh_regression (logistical regression)
	- bug fixes
2018-11-28 16:35:47 -06:00
ltcptgeneral
e915fe538e analysis.py - v 1.0.6.005
changelog:
        - added z_normalize function to normalize dataset
	- bug fixes
2018-11-28 14:29:32 -06:00
ltcptgeneral
5295bef18b Update analysis.cpython-37.pyc 2018-11-28 11:35:21 -06:00
ltcptgeneral
ae69eb7a40 Merge branch 'master' of https://github.com/ltcptgeneral/tr2022-strategy 2018-11-28 11:12:53 -06:00
jlevine18
46f434b815 started website 2018-11-28 11:10:38 -06:00
jlevine18
cce111bd6a Create index.html 2018-11-28 11:06:04 -06:00
30 changed files with 1234 additions and 348 deletions

View File

@@ -1,6 +1,7 @@
-FROM python:slim
+FROM ubuntu:20.04
 WORKDIR /
-RUN apt-get -y update; apt-get -y upgrade
-RUN apt-get -y install git
-COPY requirements.txt .
-RUN pip install -r requirements.txt
+RUN apt-get -y update
+RUN DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends tzdata
+RUN apt-get install -y python3 python3-dev git python3-pip python3-kivy python-is-python3 libgl1-mesa-dev build-essential
+RUN ln -s $(which pip3) /usr/bin/pip
+RUN pip install pymongo pandas numpy scipy scikit-learn matplotlib pylint kivy

View File

@@ -0,0 +1,2 @@
+FROM titanscout2022/tra-analysis-base:latest
+WORKDIR /

View File

@@ -1,22 +1,28 @@
 {
	"name": "TRA Analysis Development Environment",
	"build": {
-		"dockerfile": "Dockerfile",
+		"dockerfile": "dev-dockerfile",
	},
	"settings": {
		"terminal.integrated.shell.linux": "/bin/bash",
-		"python.pythonPath": "",
+		"python.pythonPath": "/usr/local/bin/python",
		"python.linting.enabled": true,
		"python.linting.pylintEnabled": true,
-		"python.linting.pylintPath": "",
-		"python.testing.pytestPath": "",
-		"editor.tabSize": 4,
-		"editor.insertSpaces": false
+		"python.formatting.autopep8Path": "/usr/local/py-utils/bin/autopep8",
+		"python.formatting.blackPath": "/usr/local/py-utils/bin/black",
+		"python.formatting.yapfPath": "/usr/local/py-utils/bin/yapf",
+		"python.linting.banditPath": "/usr/local/py-utils/bin/bandit",
+		"python.linting.flake8Path": "/usr/local/py-utils/bin/flake8",
+		"python.linting.mypyPath": "/usr/local/py-utils/bin/mypy",
+		"python.linting.pycodestylePath": "/usr/local/py-utils/bin/pycodestyle",
+		"python.linting.pydocstylePath": "/usr/local/py-utils/bin/pydocstyle",
+		"python.linting.pylintPath": "/usr/local/py-utils/bin/pylint",
+		"python.testing.pytestPath": "/usr/local/py-utils/bin/pytest"
	},
	"extensions": [
		"mhutchie.git-graph",
		"ms-python.python",
		"waderyan.gitblame"
	],
-	"postCreateCommand": ""
+	"postCreateCommand": "/usr/bin/pip3 install -r ${containerWorkspaceFolder}/analysis-master/requirements.txt && /usr/bin/pip3 install --no-cache-dir pylint && /usr/bin/pip3 install pytest"
 }

View File

@@ -1,8 +0,0 @@
-numpy
-scipy
-scikit-learn
-six
-pyparsing
-pylint
-pytest

View File

@@ -10,12 +10,12 @@ on:
    branches: [ master ]
 jobs:
-  unittest:
+  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
-        python-version: ["3.7", "3.8", "3.9", "3.10"]
+        python-version: [3.7, 3.8]
    env:
      working-directory: ./analysis-master/

View File

@@ -2,7 +2,5 @@ numpy
 scipy
 scikit-learn
 six
-pyparsing
-pylint
-pytest
+matplotlib
+pyparsing

View File

@@ -5,11 +5,9 @@ from sklearn import metrics
 from tra_analysis import Analysis as an
 from tra_analysis import Array
 from tra_analysis import ClassificationMetric
-from tra_analysis import Clustering
 from tra_analysis import CorrelationTest
 from tra_analysis import Fit
 from tra_analysis import KNN
-from tra_analysis import metrics as m
 from tra_analysis import NaiveBayes
 from tra_analysis import RandomForest
 from tra_analysis import RegressionMetric
@@ -28,7 +26,7 @@ x_data_circular = []
 y_data_circular = []
 y_data_ccu = [1, 3, 7, 14, 21]
-y_data_ccd = [8.66, 8.5, 7, 5, 1]
+y_data_ccd = [1, 5, 7, 8.5, 8.66]
 test_data_scrambled = [-32, 34, 19, 72, -65, -11, -43, 6, 85, -17, -98, -26, 12, 20, 9, -92, -40, 98, -78, 17, -20, 49, 93, -27, -24, -66, 40, 84, 1, -64, -68, -25, -42, -46, -76, 43, -3, 30, -14, -34, -55, -13, 41, -30, 0, -61, 48, 23, 60, 87, 80, 77, 53, 73, 79, 24, -52, 82, 8, -44, 65, 47, -77, 94, 7, 37, -79, 36, -94, 91, 59, 10, 97, -38, -67, 83, 54, 31, -95, -63, 16, -45, 21, -12, 66, -48, -18, -96, -90, -21, -83, -74, 39, 64, 69, -97, 13, 55, 27, -39]
 test_data_sorted = [-98, -97, -96, -95, -94, -92, -90, -83, -79, -78, -77, -76, -74, -68, -67, -66, -65, -64, -63, -61, -55, -52, -48, -46, -45, -44, -43, -42, -40, -39, -38, -34, -32, -30, -27, -26, -25, -24, -21, -20, -18, -17, -14, -13, -12, -11, -3, 0, 1, 6, 7, 8, 9, 10, 12, 13, 16, 17, 19, 20, 21, 23, 24, 27, 30, 31, 34, 36, 37, 39, 40, 41, 43, 47, 48, 49, 53, 54, 55, 59, 60, 64, 65, 66, 69, 72, 73, 77, 79, 80, 82, 83, 84, 85, 87, 91, 93, 94, 97, 98]
@@ -49,25 +47,16 @@ def test_basicstats():
 def test_regression():
	assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccu, ["lin"])) == True
-	assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccd, ["log"])) == True
-	assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccu, ["exp"])) == True
-	assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccu, ["ply"])) == True
-	assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccd, ["sig"])) == True
+	#assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccd, ["log"])) == True
+	#assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccu, ["exp"])) == True
+	#assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccu, ["ply"])) == True
+	#assert all(isinstance(item, str) for item in an.regression(test_data_linear, y_data_ccd, ["sig"])) == True
 def test_metrics():
	assert an.Metric().elo(1500, 1500, [1, 0], 400, 24) == 1512.0
	assert an.Metric().glicko2(1500, 250, 0.06, [1500, 1400], [250, 240], [1, 0]) == (1478.864307445517, 195.99122679202452, 0.05999602937563585)
-	e = [[(21.346, 7.875), (20.415, 7.808), (29.037, 7.170)], [(28.654, 7.875), (28.654, 7.875), (23.225, 6.287)]]
-	r = an.Metric().trueskill([[(25, 8.33), (24, 8.25), (32, 7.5)], [(25, 8.33), (25, 8.33), (21, 6.5)]], [1, 0])
-	i = 0
-	for group in r:
-		j = 0
-		for team in group:
-			assert abs(team.mu - e[i][j][0]) < 0.001
-			assert abs(team.sigma - e[i][j][1]) < 0.001
-			j+=1
-		i+=1
+	#assert an.Metric().trueskill([[(25, 8.33), (24, 8.25), (32, 7.5)], [(25, 8.33), (25, 8.33), (21, 6.5)]], [1, 0]) == [(metrics.trueskill.Rating(mu=21.346, sigma=7.875), metrics.trueskill.Rating(mu=20.415, sigma=7.808), metrics.trueskill.Rating(mu=29.037, sigma=7.170)), (metrics.trueskill.Rating(mu=28.654, sigma=7.875), metrics.trueskill.Rating(mu=28.654, sigma=7.875), metrics.trueskill.Rating(mu=23.225, sigma=6.287))]
 def test_array():
@@ -153,9 +142,14 @@ def test_sort():
	assert all(a == b for a, b in zip(sort(test_data_scrambled), test_data_sorted))
 def test_statisticaltest():
+	#print(StatisticalTest.tukey_multicomparison([test_data_linear, test_data_linear2, test_data_linear3]))
	assert StatisticalTest.tukey_multicomparison([test_data_linear, test_data_linear2, test_data_linear3]) == \
		{'group 1 and group 2': [0.32571517201527916, False], 'group 1 and group 3': [0.977145516045838, False], 'group 2 and group 3': [0.6514303440305589, False]}
+	#assert all(np.isclose([i[0] for i in list(StatisticalTest.tukey_multicomparison([test_data_linear, test_data_linear2, test_data_linear3]).values],
+	#	[0.32571517201527916, 0.977145516045838, 0.6514303440305589]))
+	#assert [i[1] for i in StatisticalTest.tukey_multicomparison([test_data_linear, test_data_linear2, test_data_linear3]).values] == \
+	#	[False, False, False]
 def test_svm():
@@ -236,18 +230,4 @@ def test_equation():
		"-(sgn(cos(PI/4)))": -1,
	}
	for key in list(correctParse.keys()):
		assert parser.eval(key) == correctParse[key]
-def test_clustering():
-	normalizer = sklearn.preprocessing.Normalizer()
-	data = X = np.array([[1, 2], [2, 2], [2, 3], [8, 7], [8, 8], [25, 80]])
-	assert Clustering.dbscan(data, eps=3, min_samples=2).tolist() == [0, 0, 0, 1, 1, -1]
-	assert Clustering.dbscan(data, normalizer=normalizer, eps=3, min_samples=2).tolist() == [0, 0, 0, 0, 0, 0]
-	data = np.array([[1, 1], [2, 1], [1, 0], [4, 7], [3, 5], [3, 6]])
-	assert Clustering.spectral(data, n_clusters=2, assign_labels='discretize', random_state=0).tolist() == [1, 1, 1, 0, 0, 0]
-	assert Clustering.spectral(data, normalizer=normalizer, n_clusters=2, assign_labels='discretize', random_state=0).tolist() == [0, 1, 1, 0, 0, 0]
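The test_metrics hunk above swaps an exact-equality assertion on TrueSkill output for per-field tolerance checks on each rating's mu and sigma. A minimal, self-contained sketch of that comparison pattern; the values and the `close_enough` helper name here are illustrative, not part of the repository:

```python
# Hypothetical standalone sketch of the tolerance-based comparison used in
# test_metrics: compare nested (mu, sigma) pairs within 0.001 instead of
# asserting exact equality, which is brittle for floating-point outputs.
expected = [[(21.346, 7.875), (20.415, 7.808)], [(28.654, 7.875), (23.225, 6.287)]]
actual   = [[(21.3461, 7.8749), (20.4151, 7.8081)], [(28.6539, 7.8751), (23.2249, 6.2869)]]

def close_enough(result, reference, tol=0.001):
	# Walk both nested structures in lockstep, group by group, team by team.
	for group_r, group_e in zip(result, reference):
		for (mu_r, sigma_r), (mu_e, sigma_e) in zip(group_r, group_e):
			if abs(mu_r - mu_e) >= tol or abs(sigma_r - sigma_e) >= tol:
				return False
	return True

print(close_enough(actual, expected))  # True
```

The same idea is what the commented-out `np.isclose` lines in test_statisticaltest gesture at: assert closeness of the numeric parts and exact equality only of the boolean parts.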

View File

@@ -7,19 +7,10 @@
# current benchmark of optimization: 1.33 times faster # current benchmark of optimization: 1.33 times faster
# setup: # setup:
__version__ = "3.0.6" __version__ = "3.0.2"
# changelog should be viewed using print(analysis.__changelog__) # changelog should be viewed using print(analysis.__changelog__)
__changelog__ = """changelog: __changelog__ = """changelog:
3.0.6:
- added docstrings
3.0.5:
- removed extra submodule imports
- fixed/optimized header
3.0.4:
- removed -_obj imports
3.0.3:
- fixed spelling of deprecate
3.0.2: 3.0.2:
- fixed __all__ - fixed __all__
3.0.1: 3.0.1:
@@ -67,7 +58,7 @@ __changelog__ = """changelog:
- cycle sort - cycle sort
- cocktail sort - cocktail sort
- tested all sorting algorithms with both lists and numpy arrays - tested all sorting algorithms with both lists and numpy arrays
- deprecated sort function from Array class - depreciated sort function from Array class
- added warnings as an import - added warnings as an import
2.1.4: 2.1.4:
- added sort and search functions to Array class - added sort and search functions to Array class
@@ -145,7 +136,7 @@ __changelog__ = """changelog:
1.12.4: 1.12.4:
- renamed gliko to glicko - renamed gliko to glicko
1.12.3: 1.12.3:
- removed deprecated code - removed depreciated code
1.12.2: 1.12.2:
- removed team first time trueskill instantiation in favor of integration in superscript.py - removed team first time trueskill instantiation in favor of integration in superscript.py
1.12.1: 1.12.1:
@@ -257,10 +248,10 @@ __changelog__ = """changelog:
1.0.0: 1.0.0:
- removed c_entities,nc_entities,obstacles,objectives from __all__ - removed c_entities,nc_entities,obstacles,objectives from __all__
- applied numba.jit to all functions - applied numba.jit to all functions
- deprecated and removed stdev_z_split - depreciated and removed stdev_z_split
- cleaned up histo_analysis to include numpy and numba.jit optimizations - cleaned up histo_analysis to include numpy and numba.jit optimizations
- deprecated and removed all regression functions in favor of future pytorch optimizer - depreciated and removed all regression functions in favor of future pytorch optimizer
- deprecated and removed all nonessential functions (basic_analysis, benchmark, strip_data) - depreciated and removed all nonessential functions (basic_analysis, benchmark, strip_data)
- optimized z_normalize using sklearn.preprocessing.normalize - optimized z_normalize using sklearn.preprocessing.normalize
- TODO: implement kernel/function based pytorch regression optimizer - TODO: implement kernel/function based pytorch regression optimizer
0.9.0: 0.9.0:
@@ -279,8 +270,8 @@ __changelog__ = """changelog:
- refactors - refactors
- bugfixes - bugfixes
0.8.0: 0.8.0:
- deprecated histo_analysis_old - depreciated histo_analysis_old
- deprecated debug - depreciated debug
- altered basic_analysis to take array data instead of filepath - altered basic_analysis to take array data instead of filepath
- refactor - refactor
- optimization - optimization
@@ -328,7 +319,7 @@ __changelog__ = """changelog:
0.3.5: 0.3.5:
- major bug fixes - major bug fixes
- updated historical analysis - updated historical analysis
- deprecated old historical analysis - depreciated old historical analysis
0.3.4: 0.3.4:
- added __version__, __author__, __all__ - added __version__, __author__, __all__
- added polynomial regression - added polynomial regression
@@ -366,6 +357,7 @@ __all__ = [
'histo_analysis',
'regression',
'Metric',
'kmeans',
'pca',
'decisiontree',
# all statistics functions left out due to integration in other functions
@@ -378,39 +370,40 @@ __all__ = [
import csv
from tra_analysis.metrics import elo as Elo
from tra_analysis.metrics import glicko2 as Glicko2
import math
import numpy as np
import scipy
import sklearn, sklearn.cluster, sklearn.pipeline
import sklearn
from sklearn import preprocessing, pipeline, linear_model, metrics, cluster, decomposition, tree, neighbors, naive_bayes, svm, model_selection, ensemble
from tra_analysis.metrics import trueskill as Trueskill
import warnings
# import submodules
from .Array import Array
from .ClassificationMetric import ClassificationMetric
from .CorrelationTest_obj import CorrelationTest
from .KNN_obj import KNN
from .NaiveBayes_obj import NaiveBayes
from .RandomForest_obj import RandomForest
from .RegressionMetric import RegressionMetric
from .Sort_obj import Sort
from .StatisticalTest_obj import StatisticalTest
from . import SVM
class error(ValueError):
pass
def load_csv(filepath):
"""
Loads csv file into 2D numpy array. Does not check csv file validity.
parameters:
filepath: String path to the csv file
return:
2D numpy array of values stored in csv file
"""
with open(filepath, newline='') as csvfile:
file_array = np.array(list(csv.reader(csvfile)))
csvfile.close()
return file_array
# expects 1d array
def basic_stats(data):
"""
Calculates mean, median, standard deviation, variance, minimum, maximum of a simple set of elements.
parameters:
data: List representing set of unordered elements
return:
Dictionary with (mean, median, standard-deviation, variance, minimum, maximum) as keys and corresponding values
"""
data_t = np.array(data).astype(float)
_mean = mean(data_t)
@@ -422,43 +415,24 @@ def basic_stats(data):
return {"mean": _mean, "median": _median, "standard-deviation": _stdev, "variance": _variance, "minimum": _min, "maximum": _max}
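A dependency-free sketch of the dictionary basic_stats returns, using the standard library's statistics module in place of the numpy/scipy calls (whether the library computes population or sample stdev/variance is an assumption here):

```python
import statistics

def basic_stats_sketch(data):
    # mirrors the dictionary shape returned by basic_stats above;
    # population variants are assumed for stdev/variance
    data = [float(x) for x in data]
    return {
        "mean": statistics.mean(data),
        "median": statistics.median(data),
        "standard-deviation": statistics.pstdev(data),
        "variance": statistics.pvariance(data),
        "minimum": min(data),
        "maximum": max(data),
    }

stats = basic_stats_sketch([1, 2, 3, 4])
```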
# returns z score with inputs of point, mean and standard deviation of spread
def z_score(point, mean, stdev):
"""
Calculates z score of a specific point given mean and standard deviation of data.
parameters:
point: Real value corresponding to a single point of data
mean: Real value corresponding to the mean of the dataset
stdev: Real value corresponding to the standard deviation of the dataset
return:
Real value that is the point's z score
"""
score = (point - mean) / stdev
return score
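For example, the z_score above is just the signed distance from the mean measured in standard deviations:

```python
def z_score(point, mean, stdev):
    # identical arithmetic to the z_score function above
    return (point - mean) / stdev

# a point at 75 in a distribution with mean 50 and stdev 10 sits 2.5 sigma high
score = z_score(75, 50, 10)
```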
# expects 2d array, normalizes across all axes
def z_normalize(array, *args):
"""
Applies sklearn.preprocessing.normalize(array, axis = arg) for each arg in args, on any arraylike parseable by numpy.
parameters:
array: array like structure of reals aka nested indexables
*args: arguments relating to axis normalized against
return:
numpy array of normalized values from ArrayLike input
"""
array = np.array(array)
for arg in args:
array = sklearn.preprocessing.normalize(array, axis = arg)
return array
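sklearn.preprocessing.normalize defaults to L2 normalization, so one pass of the loop above can be sketched in pure Python; normalizing along axis 1 (rows) is assumed for illustration:

```python
import math

def l2_normalize_rows(array):
    # scale each row to unit Euclidean length, as
    # sklearn.preprocessing.normalize(array, axis=1) would
    out = []
    for row in array:
        norm = math.sqrt(sum(x * x for x in row)) or 1.0
        out.append([x / norm for x in row])
    return out

normalized = l2_normalize_rows([[3.0, 4.0], [0.0, 5.0]])
```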
# expects 2d array of [x,y]
def histo_analysis(hist_data):
"""
Calculates the mean and standard deviation of derivatives of (x,y) points. Requires at least 2 points to compute.
parameters:
hist_data: list of real coordinate point data (x, y)
return:
Dictionary with (mean, deviation) as keys to corresponding values
"""
if len(hist_data[0]) > 2:
hist_data = np.array(hist_data)
@@ -474,15 +448,7 @@ def histo_analysis(hist_data):
return None
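Per its docstring, histo_analysis summarizes the finite-difference slopes between consecutive points; a standard-library sketch of that computation (the exact input layout and the population-vs-sample deviation are assumptions):

```python
import statistics

def histo_analysis_sketch(points):
    # points: [(x, y), ...] with at least 2 entries
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    # finite-difference derivative between consecutive points
    derivatives = [(ys[i + 1] - ys[i]) / (xs[i + 1] - xs[i]) for i in range(len(points) - 1)]
    return {"mean": statistics.mean(derivatives), "deviation": statistics.pstdev(derivatives)}

result = histo_analysis_sketch([(0, 0), (1, 2), (2, 4)])
```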
def regression(inputs, outputs, args): # inputs, outputs expects N-D array
"""
Applies specified regression kernels onto input, output data pairs.
parameters:
inputs: List of Reals representing independent variable values of each point
outputs: List of Reals representing dependent variable values of each point
args: List of Strings from values (lin, log, exp, ply, sig)
return:
Dictionary with (lin, log, exp, ply, sig) as keys to corresponding regression models
"""
X = np.array(inputs)
y = np.array(outputs)
@@ -586,39 +552,13 @@ def regression(inputs, outputs, args): # inputs, outputs expects N-D array
return regressions
class Metric:
"""
The metric class wraps the metrics models. Call without instantiation as Metric.<method>(...)
"""
def elo(self, starting_score, opposing_score, observed, N, K):
"""
Calculates an adjusted ELO score given a player's current score, opponent's score, and outcome of match.
reference: https://en.wikipedia.org/wiki/Elo_rating_system
parameters:
starting_score: Real value representing player's ELO score before a match
opposing_score: Real value representing opponent's score before the match
observed: Array of Real values representing multiple sequential match outcomes against the same opponent. 1 for match win, 0.5 for tie, 0 for loss.
N: Real value representing the normal or mean score expected (usually 1200)
K: Real value representing a system constant, determines how quickly players will change scores (usually 24)
return:
Real value representing the player's new ELO score
"""
return Elo.calculate(starting_score, opposing_score, observed, N, K)
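As a rough illustration of the update Elo.calculate performs, a textbook Elo step derives an expected score from the rating gap and nudges the rating by K times the surprise (treating N as the logistic divisor is an assumption here; the library's exact formula may differ):

```python
def elo_sketch(starting_score, opposing_score, observed, N, K):
    # expected win probability from the logistic rating curve
    expected = 1 / (1 + 10 ** ((opposing_score - starting_score) / N))
    # one K-weighted correction per observed outcome (1 win, 0.5 tie, 0 loss)
    for outcome in observed:
        starting_score += K * (outcome - expected)
    return starting_score

# evenly matched players: a single win moves the rating up by K/2
new_score = elo_sketch(1200, 1200, [1], 400, 24)
```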
def glicko2(self, starting_score, starting_rd, starting_vol, opposing_score, opposing_rd, observations):
"""
Calculates an adjusted Glicko-2 score given a player's current score, multiple opponents' scores, and outcome of several matches.
reference: http://www.glicko.net/glicko/glicko2.pdf
parameters:
starting_score: Real value representing the player's Glicko-2 score
starting_rd: Real value representing the player's RD
starting_vol: Real value representing the player's volatility
opposing_score: List of Real values representing multiple opponent's Glicko-2 scores
opposing_rd: List of Real values representing multiple opponent's RD
opposing_vol: List of Real values representing multiple opponent's volatility
observations: List of Real values representing the outcome of several matches, where each match's opponent corresponds with the opposing_score, opposing_rd, opposing_vol values of the same index. Outcomes can be a score, presuming greater score is better.
return:
Tuple of 3 Real values representing the player's new score, rd, and vol
"""
player = Glicko2.Glicko2(rating = starting_score, rd = starting_rd, vol = starting_vol)
player.update_player([x for x in opposing_score], [x for x in opposing_rd], observations)
@@ -626,15 +566,7 @@ class Metric:
return (player.rating, player.rd, player.vol)
def trueskill(self, teams_data, observations): # teams_data is array of array of tuples ie. [[(mu, sigma), (mu, sigma), (mu, sigma)], [(mu, sigma), (mu, sigma), (mu, sigma)]]
"""
Calculates the score changes for multiple teams playing in a single match according to the trueskill algorithm.
reference: https://trueskill.org/
parameters:
teams_data: List of List of Tuples of 2 Real values representing multiple player ratings. List of teams, which is a List of players. Each player rating is a Tuple of 2 Real values (mu, sigma).
observations: List of Real values representing the match outcome. Each value in the List is the score corresponding to the team at the same index in teams_data.
return:
List of List of Tuples of 2 Real values representing new player ratings. Same structure as teams_data.
"""
team_ratings = []
for team in teams_data:
@@ -670,31 +602,23 @@ def npmax(data):
return np.amax(data)
def kmeans(data, n_clusters=8, init="k-means++", n_init=10, max_iter=300, tol=0.0001, precompute_distances="auto", verbose=0, random_state=None, copy_x=True, n_jobs=None, algorithm="auto"):
kernel = sklearn.cluster.KMeans(n_clusters = n_clusters, init = init, n_init = n_init, max_iter = max_iter, tol = tol, precompute_distances = precompute_distances, verbose = verbose, random_state = random_state, copy_x = copy_x, n_jobs = n_jobs, algorithm = algorithm)
kernel.fit(data)
predictions = kernel.predict(data)
centers = kernel.cluster_centers_
return centers, predictions
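The fit/predict/cluster_centers_ pattern the kmeans wrapper follows can be mimicked without sklearn; here is a tiny 1-D Lloyd's-algorithm sketch, purely illustrative and not the library's implementation:

```python
def kmeans_1d(data, centers, iterations=10):
    labels = [0] * len(data)
    for _ in range(iterations):
        # assignment step: index of the nearest center for each point
        labels = [min(range(len(centers)), key=lambda i: abs(x - centers[i])) for x in data]
        # update step: move each center to the mean of its assigned points
        for i in range(len(centers)):
            members = [x for x, label in zip(data, labels) if label == i]
            if members:
                centers[i] = sum(members) / len(members)
    return centers, labels

centers, predictions = kmeans_1d([1.0, 1.1, 0.9, 9.0, 9.2, 8.8], [0.0, 10.0])
```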
def pca(data, n_components = None, copy = True, whiten = False, svd_solver = "auto", tol = 0.0, iterated_power = "auto", random_state = None):
"""
Performs a principal component analysis on the input data.
reference: https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.PCA.html
parameters:
data: Arraylike of Reals representing the set of data to perform PCA on
* : refer to reference for usage, parameters follow same usage
return:
Arraylike of Reals representing the set of data that has had PCA performed. The dimensionality of the Arraylike may be smaller or equal.
"""
kernel = sklearn.decomposition.PCA(n_components = n_components, copy = copy, whiten = whiten, svd_solver = svd_solver, tol = tol, iterated_power = iterated_power, random_state = random_state)
return kernel.fit_transform(data)
def decisiontree(data, labels, test_size = 0.3, criterion = "gini", splitter = "default", max_depth = None): #expects *2d data and 1d labels
"""
Generates a decision tree classifier fitted to the given data.
reference: https://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html
parameters:
data: List of values representing each data point of multiple axes
labels: List of values representing the labels corresponding to the same index at data
* : refer to reference for usage, parameters follow same usage
return:
DecisionTreeClassifier model and corresponding classification accuracy metrics
"""
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.tree.DecisionTreeClassifier(criterion = criterion, splitter = splitter, max_depth = max_depth)
model = model.fit(data_train,labels_train)

View File

@@ -4,11 +4,9 @@
# this should be imported as a python module using 'from tra_analysis import Array'
# setup:
__version__ = "1.0.4"
__changelog__ = """changelog:
1.0.4:
- fixed spelling of deprecate
1.0.3:
- fixed __all__
1.0.2:
@@ -137,8 +135,8 @@ class Array(): # tests on nd arrays independent of basic_stats
return Array(np.transpose(self.array))
def sort(self, array): # deprecated
warnings.warn("Array.sort has been deprecated in favor of Sort")
array_length = len(array)
if array_length <= 1:
return array

View File

@@ -4,11 +4,9 @@
# this should be imported as a python module using 'from tra_analysis import ClassificationMetric'
# setup:
__version__ = "1.0.2"
__changelog__ = """changelog:
1.0.2:
- optimized imports
1.0.1:
- fixed __all__
1.0.0:
@@ -24,6 +22,7 @@ __all__ = [
]
import sklearn
from sklearn import metrics
class ClassificationMetric():

View File

@@ -1,63 +0,0 @@
# Titan Robotics Team 2022: Clustering submodule
# Written by Arthur Lu
# Notes:
# this should be imported as a python module using 'from tra_analysis import Clustering'
# setup:
__version__ = "2.0.2"
# changelog should be viewed using print(analysis.__changelog__)
__changelog__ = """changelog:
2.0.2:
- generalized optional args to **kwargs
2.0.1:
- added normalization preprocessing to clustering, expects instance of sklearn.preprocessing.Normalizer()
2.0.0:
- added dbscan clustering algo
- added spectral clustering algo
1.0.0:
- created this submodule
- copied kmeans clustering from Analysis
"""
__author__ = (
"Arthur Lu <learthurgo@gmail.com>",
)
__all__ = [
"kmeans",
"dbscan",
"spectral",
]
import sklearn
def kmeans(data, normalizer = None, **kwargs):
if normalizer != None:
data = normalizer.transform(data)
kernel = sklearn.cluster.KMeans(**kwargs)
kernel.fit(data)
predictions = kernel.predict(data)
centers = kernel.cluster_centers_
return centers, predictions
def dbscan(data, normalizer=None, **kwargs):
if normalizer != None:
data = normalizer.transform(data)
model = sklearn.cluster.DBSCAN(**kwargs).fit(data)
return model.labels_
def spectral(data, normalizer=None, **kwargs):
if normalizer != None:
data = normalizer.transform(data)
model = sklearn.cluster.SpectralClustering(**kwargs).fit(data)
return model.labels_

View File

@@ -4,13 +4,9 @@
# this should be imported as a python module using 'from tra_analysis import CorrelationTest'
# setup:
__version__ = "1.0.3"
__changelog__ = """changelog:
1.0.3:
- generalized optional args to **kwargs
1.0.2:
- optimized imports
1.0.1:
- fixed __all__
1.0.0:
@@ -33,6 +29,7 @@ __all__ = [
]
import scipy
from scipy import stats
def anova_oneway(*args): #expects arrays of samples
@@ -44,9 +41,9 @@ def pearson(x, y):
results = scipy.stats.pearsonr(x, y)
return {"r-value": results[0], "p-value": results[1]}
def spearman(a, b = None, **kwargs):
results = scipy.stats.spearmanr(a, b = b, **kwargs)
return {"r-value": results[0], "p-value": results[1]}
def point_biserial(x, y):
@@ -54,17 +51,17 @@ def point_biserial(x, y):
results = scipy.stats.pointbiserialr(x, y)
return {"r-value": results[0], "p-value": results[1]}
def kendall(x, y, **kwargs):
results = scipy.stats.kendalltau(x, y, **kwargs)
return {"tau": results[0], "p-value": results[1]}
def kendall_weighted(x, y, **kwargs):
results = scipy.stats.weightedtau(x, y, **kwargs)
return {"tau": results[0], "p-value": results[1]}
def mgc(x, y, **kwargs):
results = scipy.stats.multiscale_graphcorr(x, y, **kwargs)
return {"k-value": results[0], "p-value": results[1], "data": results[2]} # unsure if MGC test returns a k value

View File

@@ -0,0 +1,41 @@
# Only included for backwards compatibility! Do not update, CorrelationTest is preferred and supported.
import scipy
from scipy import stats
class CorrelationTest:
def anova_oneway(self, *args): #expects arrays of samples
results = scipy.stats.f_oneway(*args)
return {"f-value": results[0], "p-value": results[1]}
def pearson(self, x, y):
results = scipy.stats.pearsonr(x, y)
return {"r-value": results[0], "p-value": results[1]}
def spearman(self, a, b = None, axis = 0, nan_policy = 'propagate'):
results = scipy.stats.spearmanr(a, b = b, axis = axis, nan_policy = nan_policy)
return {"r-value": results[0], "p-value": results[1]}
def point_biserial(self, x,y):
results = scipy.stats.pointbiserialr(x, y)
return {"r-value": results[0], "p-value": results[1]}
def kendall(self, x, y, initial_lexsort = None, nan_policy = 'propagate', method = 'auto'):
results = scipy.stats.kendalltau(x, y, initial_lexsort = initial_lexsort, nan_policy = nan_policy, method = method)
return {"tau": results[0], "p-value": results[1]}
def kendall_weighted(self, x, y, rank = True, weigher = None, additive = True):
results = scipy.stats.weightedtau(x, y, rank = rank, weigher = weigher, additive = additive)
return {"tau": results[0], "p-value": results[1]}
def mgc(self, x, y, compute_distance = None, reps = 1000, workers = 1, is_twosamp = False, random_state = None):
results = scipy.stats.multiscale_graphcorr(x, y, compute_distance = compute_distance, reps = reps, workers = workers, is_twosamp = is_twosamp, random_state = random_state)
return {"k-value": results[0], "p-value": results[1], "data": results[2]} # unsure if MGC test returns a k value

View File

@@ -4,13 +4,9 @@
# this should be imported as a python module using 'from tra_analysis import KNN'
# setup:
__version__ = "1.0.2"
__changelog__ = """changelog:
1.0.2:
- generalized optional args to **kwargs
1.0.1:
- optimized imports
1.0.0:
- ported analysis.KNN() here
- removed classness
@@ -27,21 +23,22 @@ __all__ = [
]
import sklearn
from sklearn import model_selection, neighbors
from . import ClassificationMetric, RegressionMetric
def knn_classifier(data, labels, n_neighbors = 5, test_size = 0.3, **kwargs): #expects *2d data and 1d labels post-scaling
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.neighbors.KNeighborsClassifier(n_neighbors = n_neighbors, **kwargs)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
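The classification step KNeighborsClassifier performs reduces to a distance sort plus a majority vote; a minimal 1-D sketch of that idea (not the library's code):

```python
from collections import Counter

def knn_predict_1d(train, labels, point, n_neighbors=5):
    # sort training points by distance to the query, keep the k nearest
    nearest = sorted(zip(train, labels), key=lambda pair: abs(pair[0] - point))[:n_neighbors]
    # majority vote among the k nearest labels
    return Counter(label for _, label in nearest).most_common(1)[0][0]

prediction = knn_predict_1d([1.0, 1.2, 0.8, 5.0, 5.2], ["a", "a", "a", "b", "b"], 1.1, n_neighbors=3)
```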
def knn_regressor(data, outputs, n_neighbors = 5, test_size = 0.3, **kwargs):
data_train, data_test, outputs_train, outputs_test = sklearn.model_selection.train_test_split(data, outputs, test_size=test_size, random_state=1)
model = sklearn.neighbors.KNeighborsRegressor(n_neighbors = n_neighbors, **kwargs)
model.fit(data_train, outputs_train)
predictions = model.predict(data_test)

View File

@@ -0,0 +1,25 @@
# Only included for backwards compatibility! Do not update, NaiveBayes is preferred and supported.
import sklearn
from sklearn import model_selection, neighbors
from . import ClassificationMetric, RegressionMetric
class KNN:
def knn_classifier(self, data, labels, n_neighbors, test_size = 0.3, algorithm='auto', leaf_size=30, metric='minkowski', metric_params=None, n_jobs=None, p=2, weights='uniform'): #expects *2d data and 1d labels post-scaling
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.neighbors.KNeighborsClassifier()
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def knn_regressor(self, data, outputs, n_neighbors, test_size = 0.3, weights = "uniform", algorithm = "auto", leaf_size = 30, p = 2, metric = "minkowski", metric_params = None, n_jobs = None):
data_train, data_test, outputs_train, outputs_test = sklearn.model_selection.train_test_split(data, outputs, test_size=test_size, random_state=1)
model = sklearn.neighbors.KNeighborsRegressor(n_neighbors = n_neighbors, weights = weights, algorithm = algorithm, leaf_size = leaf_size, p = p, metric = metric, metric_params = metric_params, n_jobs = n_jobs)
model.fit(data_train, outputs_train)
predictions = model.predict(data_test)
return model, RegressionMetric(predictions, outputs_test)

View File

@@ -4,13 +4,9 @@
# this should be imported as a python module using 'from tra_analysis import NaiveBayes'
# setup:
__version__ = "1.0.2"
__changelog__ = """changelog:
1.0.2:
- generalized optional args to **kwargs
1.0.1:
- optimized imports
1.0.0:
- ported analysis.NaiveBayes() here
- removed classness
@@ -22,45 +18,46 @@ __author__ = (
__all__ = [
'gaussian',
'multinomial',
'bernoulli',
'complement',
]
import sklearn
from . import ClassificationMetric
from . import ClassificationMetric, RegressionMetric
def gaussian(data, labels, test_size = 0.3, **kwargs):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.GaussianNB(**kwargs)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def multinomial(data, labels, test_size = 0.3, **kwargs):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.MultinomialNB(**kwargs)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def bernoulli(data, labels, test_size = 0.3, **kwargs):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.BernoulliNB(**kwargs)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def complement(data, labels, test_size = 0.3, **kwargs):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.ComplementNB(**kwargs)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)

View File

@@ -0,0 +1,43 @@
# Only included for backwards compatibility! Do not update, NaiveBayes is preferred and supported.
import sklearn
from sklearn import model_selection, naive_bayes
from . import ClassificationMetric, RegressionMetric
class NaiveBayes:
def guassian(self, data, labels, test_size = 0.3, priors = None, var_smoothing = 1e-09):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.GaussianNB(priors = priors, var_smoothing = var_smoothing)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def multinomial(self, data, labels, test_size = 0.3, alpha=1.0, fit_prior=True, class_prior=None):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.MultinomialNB(alpha = alpha, fit_prior = fit_prior, class_prior = class_prior)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def bernoulli(self, data, labels, test_size = 0.3, alpha=1.0, binarize=0.0, fit_prior=True, class_prior=None):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.BernoulliNB(alpha = alpha, binarize = binarize, fit_prior = fit_prior, class_prior = class_prior)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)
def complement(self, data, labels, test_size = 0.3, alpha=1.0, fit_prior=True, class_prior=None, norm=False):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
model = sklearn.naive_bayes.ComplementNB(alpha = alpha, fit_prior = fit_prior, class_prior = class_prior, norm = norm)
model.fit(data_train, labels_train)
predictions = model.predict(data_test)
return model, ClassificationMetric(predictions, labels_test)

View File

@@ -4,14 +4,9 @@
# this should be imported as a python module using 'from tra_analysis import RandomForest'
# setup:
__version__ = "1.0.3"
__changelog__ = """changelog:
1.0.3:
- updated RandomForestClassifier and RandomForestRegressor parameters to match sklearn v 1.0.2
- changed default values for kwargs to rely on sklearn
1.0.2:
- optimized imports
1.0.1:
- fixed __all__
1.0.0:
@@ -28,22 +23,23 @@ __all__ = [
"random_forest_regressor",
]
import sklearn, sklearn.ensemble, sklearn.naive_bayes
from sklearn import ensemble, model_selection
from . import ClassificationMetric, RegressionMetric
def random_forest_classifier(data, labels, test_size, n_estimators, **kwargs):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
kernel = sklearn.ensemble.RandomForestClassifier(n_estimators = n_estimators, **kwargs)
kernel.fit(data_train, labels_train)
predictions = kernel.predict(data_test)
return kernel, ClassificationMetric(predictions, labels_test)
def random_forest_regressor(data, outputs, test_size, n_estimators, **kwargs): def random_forest_regressor(data, outputs, test_size, n_estimators, criterion="mse", max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features="auto", max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, bootstrap=True, oob_score=False, n_jobs=None, random_state=None, verbose=0, warm_start=False):
data_train, data_test, outputs_train, outputs_test = sklearn.model_selection.train_test_split(data, outputs, test_size=test_size, random_state=1) data_train, data_test, outputs_train, outputs_test = sklearn.model_selection.train_test_split(data, outputs, test_size=test_size, random_state=1)
kernel = sklearn.ensemble.RandomForestRegressor(n_estimators = n_estimators, **kwargs) kernel = sklearn.ensemble.RandomForestRegressor(n_estimators = n_estimators, criterion = criterion, max_depth = max_depth, min_samples_split = min_samples_split, min_weight_fraction_leaf = min_weight_fraction_leaf, max_features = max_features, max_leaf_nodes = max_leaf_nodes, min_impurity_decrease = min_impurity_decrease, min_impurity_split = min_impurity_split, bootstrap = bootstrap, oob_score = oob_score, n_jobs = n_jobs, random_state = random_state, verbose = verbose, warm_start = warm_start)
kernel.fit(data_train, outputs_train) kernel.fit(data_train, outputs_train)
predictions = kernel.predict(data_test) predictions = kernel.predict(data_test)

View File

@@ -0,0 +1,25 @@
# Only included for backwards compatibility! Do not update, RandomForest is preferred and supported.
import sklearn
from sklearn import ensemble, model_selection
from . import ClassificationMetric, RegressionMetric
class RandomForest:
def random_forest_classifier(self, data, labels, test_size, n_estimators, criterion="gini", max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features="auto", max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, bootstrap=True, oob_score=False, n_jobs=None, random_state=None, verbose=0, warm_start=False, class_weight=None):
data_train, data_test, labels_train, labels_test = sklearn.model_selection.train_test_split(data, labels, test_size=test_size, random_state=1)
kernel = sklearn.ensemble.RandomForestClassifier(n_estimators = n_estimators, criterion = criterion, max_depth = max_depth, min_samples_split = min_samples_split, min_samples_leaf = min_samples_leaf, min_weight_fraction_leaf = min_weight_fraction_leaf, max_leaf_nodes = max_leaf_nodes, min_impurity_decrease = min_impurity_decrease, bootstrap = bootstrap, oob_score = oob_score, n_jobs = n_jobs, random_state = random_state, verbose = verbose, warm_start = warm_start, class_weight = class_weight)
kernel.fit(data_train, labels_train)
predictions = kernel.predict(data_test)
return kernel, ClassificationMetric(predictions, labels_test)
def random_forest_regressor(self, data, outputs, test_size, n_estimators, criterion="mse", max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features="auto", max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, bootstrap=True, oob_score=False, n_jobs=None, random_state=None, verbose=0, warm_start=False):
data_train, data_test, outputs_train, outputs_test = sklearn.model_selection.train_test_split(data, outputs, test_size=test_size, random_state=1)
kernel = sklearn.ensemble.RandomForestRegressor(n_estimators = n_estimators, criterion = criterion, max_depth = max_depth, min_samples_split = min_samples_split, min_weight_fraction_leaf = min_weight_fraction_leaf, max_features = max_features, max_leaf_nodes = max_leaf_nodes, min_impurity_decrease = min_impurity_decrease, min_impurity_split = min_impurity_split, bootstrap = bootstrap, oob_score = oob_score, n_jobs = n_jobs, random_state = random_state, verbose = verbose, warm_start = warm_start)
kernel.fit(data_train, outputs_train)
predictions = kernel.predict(data_test)
return kernel, RegressionMetric(predictions, outputs_test)
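Both wrappers delegate to sklearn's ensembles, which aggregate many decision trees; for the classifier this aggregation is a majority vote over per-tree predictions. A dependency-free sketch of that voting step (data layout assumed: one list of labels per tree, aligned by sample):

```python
from collections import Counter

def majority_vote(tree_predictions):
	# pick the most common label across trees for each sample
	n_samples = len(tree_predictions[0])
	return [Counter(tree[i] for tree in tree_predictions).most_common(1)[0][0]
	        for i in range(n_samples)]

votes = [[0, 1, 1], [0, 0, 1], [1, 1, 1]]  # three trees, three samples
print(majority_vote(votes))  # [0, 1, 1]
```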

View File

@@ -4,11 +4,9 @@
 # this should be imported as a python module using 'from tra_analysis import RegressionMetric'
 # setup:
-__version__ = "1.0.1"
+__version__ = "1.0.0"
 __changelog__ = """changelog:
-	1.0.1:
-		- optimized imports
 	1.0.0:
 		- ported analysis.RegressionMetric() here
 """
@@ -23,6 +21,7 @@ __all__ = [
 import numpy as np
 import sklearn
+from sklearn import metrics
 class RegressionMetric():
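RegressionMetric wraps sklearn.metrics regression scores; r² is the usual headline number. A dependency-free sketch of the conventional r² computation (function name hypothetical):

```python
def r_squared(predictions, targets):
	# coefficient of determination: 1 - SS_res / SS_tot
	mean_t = sum(targets) / len(targets)
	ss_res = sum((t - p) ** 2 for p, t in zip(predictions, targets))
	ss_tot = sum((t - mean_t) ** 2 for t in targets)
	return 1 - ss_res / ss_tot

print(r_squared([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]))
```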

View File

@@ -4,11 +4,9 @@
 # this should be imported as a python module using 'from tra_analysis import SVM'
 # setup:
-__version__ = "1.0.3"
+__version__ = "1.0.2"
 __changelog__ = """changelog:
-	1.0.3:
-		- optimized imports
 	1.0.2:
 		- fixed __all__
 	1.0.1:
@@ -32,6 +30,7 @@ __all__ = [
 ]
 import sklearn
+from sklearn import svm
 from . import ClassificationMetric, RegressionMetric
 class CustomKernel:

View File

@@ -16,7 +16,7 @@ __changelog__ = """changelog:
 __author__ = (
 	"Arthur Lu <learthurgo@gmail.com>",
-	"James Pan <zpan@imsa.edu>",
+	"James Pan <zpan@imsa.edu>"
 )
 __all__ = [

View File

@@ -0,0 +1,391 @@
# Only included for backwards compatibility! Do not update, Sort is preferred and supported.
import numpy as np
class Sort: # if you haven't used a sort, then you've never lived
def quicksort(self, a):
def sort(array):
less = []
equal = []
greater = []
if len(array) > 1:
pivot = array[0]
for x in array:
if x < pivot:
less.append(x)
elif x == pivot:
equal.append(x)
elif x > pivot:
greater.append(x)
return sort(less)+equal+sort(greater)
else:
return array
return np.array(sort(a))
def mergesort(self, a):
def sort(array):
array = array
if len(array) >1:
middle = len(array) // 2
L = array[:middle]
R = array[middle:]
sort(L)
sort(R)
i = j = k = 0
while i < len(L) and j < len(R):
if L[i] < R[j]:
array[k] = L[i]
i+= 1
else:
array[k] = R[j]
j+= 1
k+= 1
while i < len(L):
array[k] = L[i]
i+= 1
k+= 1
while j < len(R):
array[k] = R[j]
j+= 1
k+= 1
return array
return sort(a)
def introsort(self, a):
def sort(array, start, end, maxdepth):
array = array
if end - start <= 1:
return
elif maxdepth == 0:
heapsort(array, start, end)
else:
p = partition(array, start, end)
sort(array, start, p + 1, maxdepth - 1)
sort(array, p + 1, end, maxdepth - 1)
return array
def partition(array, start, end):
pivot = array[start]
i = start - 1
j = end
while True:
i = i + 1
while array[i] < pivot:
i = i + 1
j = j - 1
while array[j] > pivot:
j = j - 1
if i >= j:
return j
swap(array, i, j)
def swap(array, i, j):
array[i], array[j] = array[j], array[i]
def heapsort(array, start, end):
build_max_heap(array, start, end)
for i in range(end - 1, start, -1):
swap(array, start, i)
max_heapify(array, index=0, start=start, end=i)
def build_max_heap(array, start, end):
def parent(i):
return (i - 1)//2
length = end - start
index = parent(length - 1)
while index >= 0:
max_heapify(array, index, start, end)
index = index - 1
def max_heapify(array, index, start, end):
def left(i):
return 2*i + 1
def right(i):
return 2*i + 2
size = end - start
l = left(index)
r = right(index)
if (l < size and array[start + l] > array[start + index]):
largest = l
else:
largest = index
if (r < size and array[start + r] > array[start + largest]):
largest = r
if largest != index:
swap(array, start + largest, start + index)
max_heapify(array, largest, start, end)
maxdepth = (len(a).bit_length() - 1)*2
return sort(a, 0, len(a), maxdepth)
def heapsort(self, a):
def sort(array):
array = array
n = len(array)
for i in range(n//2 - 1, -1, -1):
heapify(array, n, i)
for i in range(n-1, 0, -1):
array[i], array[0] = array[0], array[i]
heapify(array, i, 0)
return array
def heapify(array, n, i):
array = array
largest = i
l = 2 * i + 1
r = 2 * i + 2
if l < n and array[i] < array[l]:
largest = l
if r < n and array[largest] < array[r]:
largest = r
if largest != i:
array[i],array[largest] = array[largest],array[i]
heapify(array, n, largest)
return array
return sort(a)
def insertionsort(self, a):
def sort(array):
array = array
for i in range(1, len(array)):
key = array[i]
j = i-1
while j >=0 and key < array[j] :
array[j+1] = array[j]
j -= 1
array[j+1] = key
return array
return sort(a)
def timsort(self, a, block = 32):
BLOCK = block
def sort(array, n):
array = array
for i in range(0, n, BLOCK):
insertionsort(array, i, min((i+BLOCK-1), (n-1)))
size = BLOCK
while size < n:
for left in range(0, n, 2*size):
mid = left + size - 1
right = min((left + 2*size - 1), (n-1))
merge(array, left, mid, right)
size = 2*size
return array
def insertionsort(array, left, right):
array = array
for i in range(left + 1, right+1):
temp = array[i]
j = i - 1
while j >= left and array[j] > temp :
array[j+1] = array[j]
j -= 1
array[j+1] = temp
return array
def merge(array, l, m, r):
len1, len2 = m - l + 1, r - m
left, right = [], []
for i in range(0, len1):
left.append(array[l + i])
for i in range(0, len2):
right.append(array[m + 1 + i])
i, j, k = 0, 0, l
while i < len1 and j < len2:
if left[i] <= right[j]:
array[k] = left[i]
i += 1
else:
array[k] = right[j]
j += 1
k += 1
while i < len1:
array[k] = left[i]
k += 1
i += 1
while j < len2:
array[k] = right[j]
k += 1
j += 1
return sort(a, len(a))
def selectionsort(self, a):
array = a
for i in range(len(array)):
min_idx = i
for j in range(i+1, len(array)):
if array[min_idx] > array[j]:
min_idx = j
array[i], array[min_idx] = array[min_idx], array[i]
return array
def shellsort(self, a):
array = a
n = len(array)
gap = n//2
while gap > 0:
for i in range(gap,n):
temp = array[i]
j = i
while j >= gap and array[j-gap] >temp:
array[j] = array[j-gap]
j -= gap
array[j] = temp
gap //= 2
return array
def bubblesort(self, a):
def sort(array):
for i, num in enumerate(array):
try:
if array[i+1] < num:
array[i] = array[i+1]
array[i+1] = num
sort(array)
except IndexError:
pass
return array
return sort(a)
def cyclesort(self, a):
def sort(array):
array = array
writes = 0
for cycleStart in range(0, len(array) - 1):
item = array[cycleStart]
pos = cycleStart
for i in range(cycleStart + 1, len(array)):
if array[i] < item:
pos += 1
if pos == cycleStart:
continue
while item == array[pos]:
pos += 1
array[pos], item = item, array[pos]
writes += 1
while pos != cycleStart:
pos = cycleStart
for i in range(cycleStart + 1, len(array)):
if array[i] < item:
pos += 1
while item == array[pos]:
pos += 1
array[pos], item = item, array[pos]
writes += 1
return array
return sort(a)
def cocktailsort(self, a):
def sort(array):
array = array
n = len(array)
swapped = True
start = 0
end = n-1
while (swapped == True):
swapped = False
for i in range (start, end):
if (array[i] > array[i + 1]) :
array[i], array[i + 1]= array[i + 1], array[i]
swapped = True
if (swapped == False):
break
swapped = False
end = end-1
for i in range(end-1, start-1, -1):
if (array[i] > array[i + 1]):
array[i], array[i + 1] = array[i + 1], array[i]
swapped = True
start = start + 1
return array
return sort(a)
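Of the methods above, cyclesort is the one that minimizes writes to the array: each value is written at most once into its final slot, plus one extra write per displaced cycle start. A standalone sketch that also reports the write count, mirroring the `writes` counter above:

```python
def cycle_sort(array):
	# in-place cycle sort; returns the array and the number of writes performed
	writes = 0
	for start in range(len(array) - 1):
		item = array[start]
		# final position of item = start + number of smaller elements after it
		pos = start + sum(1 for x in array[start + 1:] if x < item)
		if pos == start:
			continue
		while item == array[pos]:  # skip over duplicates
			pos += 1
		array[pos], item = item, array[pos]
		writes += 1
		while pos != start:  # rotate the rest of the cycle
			pos = start + sum(1 for x in array[start + 1:] if x < item)
			while item == array[pos]:
				pos += 1
			array[pos], item = item, array[pos]
			writes += 1
	return array, writes

print(cycle_sort([3, 1, 2]))  # ([1, 2, 3], 3)
```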

View File

@@ -4,11 +4,9 @@
 # this should be imported as a python module using 'from tra_analysis import StatisticalTest'
 # setup:
-__version__ = "1.0.3"
+__version__ = "1.0.2"
 __changelog__ = """changelog:
-	1.0.3:
-		- optimized imports
 	1.0.2:
 		- added tukey_multicomparison
 		- fixed styling
@@ -63,6 +61,7 @@ __all__ = [
 import numpy as np
 import scipy
+from scipy import stats, interpolate
 def ttest_onesample(a, popmean, axis = 0, nan_policy = 'propagate'):
@@ -280,9 +279,9 @@ def get_tukeyQcrit(k, df, alpha=0.05):
 	cv001 = c[:, 2::2]
 	if alpha == 0.05:
-		intp = scipy.interpolate.interp1d(crows, cv005[:,k-2])
+		intp = interpolate.interp1d(crows, cv005[:,k-2])
 	elif alpha == 0.01:
-		intp = scipy.interpolate.interp1d(crows, cv001[:,k-2])
+		intp = interpolate.interp1d(crows, cv001[:,k-2])
 	else:
 		raise ValueError('only implemented for alpha equal to 0.01 and 0.05')
 	return intp(df)

View File

@@ -0,0 +1,170 @@
# Only included for backwards compatibility! Do not update, StatisticalTest is preferred and supported.
import scipy
from scipy import stats
class StatisticalTest:
def ttest_onesample(self, a, popmean, axis = 0, nan_policy = 'propagate'):
results = scipy.stats.ttest_1samp(a, popmean, axis = axis, nan_policy = nan_policy)
return {"t-value": results[0], "p-value": results[1]}
def ttest_independent(self, a, b, equal = True, nan_policy = 'propagate'):
results = scipy.stats.ttest_ind(a, b, equal_var = equal, nan_policy = nan_policy)
return {"t-value": results[0], "p-value": results[1]}
def ttest_statistic(self, o1, o2, equal = True):
results = scipy.stats.ttest_ind_from_stats(o1["mean"], o1["std"], o1["nobs"], o2["mean"], o2["std"], o2["nobs"], equal_var = equal)
return {"t-value": results[0], "p-value": results[1]}
def ttest_related(self, a, b, axis = 0, nan_policy='propagate'):
results = scipy.stats.ttest_rel(a, b, axis = axis, nan_policy = nan_policy)
return {"t-value": results[0], "p-value": results[1]}
def ks_fitness(self, rvs, cdf, args = (), N = 20, alternative = 'two-sided', mode = 'approx'):
results = scipy.stats.kstest(rvs, cdf, args = args, N = N, alternative = alternative, mode = mode)
return {"ks-value": results[0], "p-value": results[1]}
def chisquare(self, f_obs, f_exp = None, ddof = None, axis = 0):
results = scipy.stats.chisquare(f_obs, f_exp = f_exp, ddof = ddof, axis = axis)
return {"chisquared-value": results[0], "p-value": results[1]}
def powerdivergence(self, f_obs, f_exp = None, ddof = None, axis = 0, lambda_ = None):
results = scipy.stats.power_divergence(f_obs, f_exp = f_exp, ddof = ddof, axis = axis, lambda_ = lambda_)
return {"powerdivergence-value": results[0], "p-value": results[1]}
def ks_twosample(self, x, y, alternative = 'two-sided', mode = 'auto'):
results = scipy.stats.ks_2samp(x, y, alternative = alternative, mode = mode)
return {"ks-value": results[0], "p-value": results[1]}
def es_twosample(self, x, y, t = (0.4, 0.8)):
results = scipy.stats.epps_singleton_2samp(x, y, t = t)
return {"es-value": results[0], "p-value": results[1]}
def mw_rank(self, x, y, use_continuity = True, alternative = None):
results = scipy.stats.mannwhitneyu(x, y, use_continuity = use_continuity, alternative = alternative)
return {"u-value": results[0], "p-value": results[1]}
def mw_tiecorrection(self, rank_values):
results = scipy.stats.tiecorrect(rank_values)
return {"correction-factor": results}
def rankdata(self, a, method = 'average'):
results = scipy.stats.rankdata(a, method = method)
return results
def wilcoxon_ranksum(self, a, b): # this seems to be superseded by the Mann-Whitney U test
results = scipy.stats.ranksums(a, b)
return {"u-value": results[0], "p-value": results[1]}
def wilcoxon_signedrank(self, x, y = None, zero_method = 'wilcox', correction = False, alternative = 'two-sided'):
results = scipy.stats.wilcoxon(x, y = y, zero_method = zero_method, correction = correction, alternative = alternative)
return {"t-value": results[0], "p-value": results[1]}
def kw_htest(self, *args, nan_policy = 'propagate'):
results = scipy.stats.kruskal(*args, nan_policy = nan_policy)
return {"h-value": results[0], "p-value": results[1]}
def friedman_chisquare(self, *args):
results = scipy.stats.friedmanchisquare(*args)
return {"chisquared-value": results[0], "p-value": results[1]}
def bm_wtest(self, x, y, alternative = 'two-sided', distribution = 't', nan_policy = 'propagate'):
results = scipy.stats.brunnermunzel(x, y, alternative = alternative, distribution = distribution, nan_policy = nan_policy)
return {"w-value": results[0], "p-value": results[1]}
def combine_pvalues(self, pvalues, method = 'fisher', weights = None):
results = scipy.stats.combine_pvalues(pvalues, method = method, weights = weights)
return {"combined-statistic": results[0], "p-value": results[1]}
def jb_fitness(self, x):
results = scipy.stats.jarque_bera(x)
return {"jb-value": results[0], "p-value": results[1]}
def ab_equality(self, x, y):
results = scipy.stats.ansari(x, y)
return {"ab-value": results[0], "p-value": results[1]}
def bartlett_variance(self, *args):
results = scipy.stats.bartlett(*args)
return {"t-value": results[0], "p-value": results[1]}
def levene_variance(self, *args, center = 'median', proportiontocut = 0.05):
results = scipy.stats.levene(*args, center = center, proportiontocut = proportiontocut)
return {"w-value": results[0], "p-value": results[1]}
def sw_normality(self, x):
results = scipy.stats.shapiro(x)
return {"w-value": results[0], "p-value": results[1]}
def shapiro(self, x):
return "destroyed by facts and logic"
def ad_onesample(self, x, dist = 'norm'):
results = scipy.stats.anderson(x, dist = dist)
return {"d-value": results[0], "critical-values": results[1], "significance-value": results[2]}
def ad_ksample(self, samples, midrank = True):
results = scipy.stats.anderson_ksamp(samples, midrank = midrank)
return {"d-value": results[0], "critical-values": results[1], "significance-value": results[2]}
def binomial(self, x, n = None, p = 0.5, alternative = 'two-sided'):
results = scipy.stats.binom_test(x, n = n, p = p, alternative = alternative)
return {"p-value": results}
def fk_variance(self, *args, center = 'median', proportiontocut = 0.05):
results = scipy.stats.fligner(*args, center = center, proportiontocut = proportiontocut)
return {"h-value": results[0], "p-value": results[1]} # unknown if the statistic is an h value
def mood_mediantest(self, *args, ties = 'below', correction = True, lambda_ = 1, nan_policy = 'propagate'):
results = scipy.stats.median_test(*args, ties = ties, correction = correction, lambda_ = lambda_, nan_policy = nan_policy)
return {"chisquared-value": results[0], "p-value": results[1], "m-value": results[2], "table": results[3]}
def mood_equalscale(self, x, y, axis = 0):
results = scipy.stats.mood(x, y, axis = axis)
return {"z-score": results[0], "p-value": results[1]}
def skewtest(self, a, axis = 0, nan_policy = 'propagate'):
results = scipy.stats.skewtest(a, axis = axis, nan_policy = nan_policy)
return {"z-score": results[0], "p-value": results[1]}
def kurtosistest(self, a, axis = 0, nan_policy = 'propagate'):
results = scipy.stats.kurtosistest(a, axis = axis, nan_policy = nan_policy)
return {"z-score": results[0], "p-value": results[1]}
def normaltest(self, a, axis = 0, nan_policy = 'propagate'):
results = scipy.stats.normaltest(a, axis = axis, nan_policy = nan_policy)
return {"z-score": results[0], "p-value": results[1]}

View File

@@ -7,17 +7,10 @@
 # current benchmark of optimization: 1.33 times faster
 # setup:
-__version__ = "4.0.0"
+__version__ = "3.0.0"
 # changelog should be viewed using print(analysis.__changelog__)
 __changelog__ = """changelog:
-	4.0.0:
-		- deprecated all *_obj.py compatibility modules
-		- deprecated titanlearn.py
-		- deprecated visualization.py
-		- removed matplotlib from requirements
-		- removed extra submodule imports in Analysis
-		- added typehinting, docstrings for each function
 	3.0.0:
 		- incremented version to release 3.0.0
 	3.0.0-rc2:
@@ -47,7 +40,6 @@ __all__ = [
 	"Analysis",
 	"Array",
 	"ClassificationMetric",
-	"Clustering",
 	"CorrelationTest",
 	"Expression",
 	"Fit",
@@ -61,9 +53,9 @@ __all__ = [
 ]
 from . import Analysis as Analysis
+from . import Analysis as analysis
 from .Array import Array
 from .ClassificationMetric import ClassificationMetric
-from . import Clustering
 from . import CorrelationTest
 from .equation import Expression
 from . import Fit

View File

@@ -1,24 +0,0 @@
# Titan Robotics Team 2022: Metrics submodule
# Written by Arthur Lu
# Notes:
# this should be imported as a python module using 'from tra_analysis import metrics'
# setup:
__version__ = "1.0.0"
__changelog__ = """changelog:
1.0.0:
- implemented elo, glicko2, trueskill
"""
__author__ = (
"Arthur Lu <learthurgo@gmail.com>",
)
__all__ = {
"Expression"
}
from . import elo
from . import glicko2
from . import trueskill

View File

@@ -0,0 +1,222 @@
# Titan Robotics Team 2022: CUDA-based Regressions Module
# Not actively maintained, may be removed in future release
# Written by Arthur Lu & Jacob Levine
# Notes:
# this module has been automatically integrated into analysis.py, and should be callable as a class from the package
# this module is cuda-optimized (as appropriate) and vectorized (except for one small part)
# setup:
__version__ = "0.0.4"
# changelog should be viewed using print(analysis.regression.__changelog__)
__changelog__ = """
0.0.4:
- bug fixes
- fixed changelog
0.0.3:
- bug fixes
0.0.2:
-Added more parameters to log, exponential, polynomial
-Added SigmoidalRegKernelArthur, because Arthur apparently needs
to train the scaling and shifting of sigmoids
0.0.1:
-initial release, with linear, log, exponential, polynomial, and sigmoid kernels
-already vectorized (except for polynomial generation) and CUDA-optimized
"""
__author__ = (
"Jacob Levine <jlevine@imsa.edu>",
"Arthur Lu <learthurgo@gmail.com>",
)
__all__ = [
'factorial',
'take_all_pwrs',
'num_poly_terms',
'set_device',
'LinearRegKernel',
'SigmoidalRegKernel',
'LogRegKernel',
'PolyRegKernel',
'ExpRegKernel',
'SigmoidalRegKernelArthur',
'SGDTrain',
'CustomTrain',
'CircleFit'
]
import torch
global device
device = "cuda:0" if torch.cuda.is_available() else "cpu"
#todo: document completely
def set_device(new_device):
	global device
	device = new_device
class LinearRegKernel():
parameters= []
weights=None
bias=None
def __init__(self, num_vars):
self.weights=torch.rand(num_vars, requires_grad=True, device=device)
self.bias=torch.rand(1, requires_grad=True, device=device)
self.parameters=[self.weights,self.bias]
def forward(self,mtx):
long_bias=self.bias.repeat([1,mtx.size()[1]])
return torch.matmul(self.weights,mtx)+long_bias
class SigmoidalRegKernel():
parameters= []
weights=None
bias=None
sigmoid=torch.nn.Sigmoid()
def __init__(self, num_vars):
self.weights=torch.rand(num_vars, requires_grad=True, device=device)
self.bias=torch.rand(1, requires_grad=True, device=device)
self.parameters=[self.weights,self.bias]
def forward(self,mtx):
long_bias=self.bias.repeat([1,mtx.size()[1]])
return self.sigmoid(torch.matmul(self.weights,mtx)+long_bias)
class SigmoidalRegKernelArthur():
parameters= []
weights=None
in_bias=None
scal_mult=None
out_bias=None
sigmoid=torch.nn.Sigmoid()
def __init__(self, num_vars):
self.weights=torch.rand(num_vars, requires_grad=True, device=device)
self.in_bias=torch.rand(1, requires_grad=True, device=device)
self.scal_mult=torch.rand(1, requires_grad=True, device=device)
self.out_bias=torch.rand(1, requires_grad=True, device=device)
self.parameters=[self.weights,self.in_bias, self.scal_mult, self.out_bias]
def forward(self,mtx):
long_in_bias=self.in_bias.repeat([1,mtx.size()[1]])
long_out_bias=self.out_bias.repeat([1,mtx.size()[1]])
return (self.scal_mult*self.sigmoid(torch.matmul(self.weights,mtx)+long_in_bias))+long_out_bias
class LogRegKernel():
parameters= []
weights=None
in_bias=None
scal_mult=None
out_bias=None
def __init__(self, num_vars):
self.weights=torch.rand(num_vars, requires_grad=True, device=device)
self.in_bias=torch.rand(1, requires_grad=True, device=device)
self.scal_mult=torch.rand(1, requires_grad=True, device=device)
self.out_bias=torch.rand(1, requires_grad=True, device=device)
self.parameters=[self.weights,self.in_bias, self.scal_mult, self.out_bias]
def forward(self,mtx):
long_in_bias=self.in_bias.repeat([1,mtx.size()[1]])
long_out_bias=self.out_bias.repeat([1,mtx.size()[1]])
return (self.scal_mult*torch.log(torch.matmul(self.weights,mtx)+long_in_bias))+long_out_bias
class ExpRegKernel():
parameters= []
weights=None
in_bias=None
scal_mult=None
out_bias=None
def __init__(self, num_vars):
self.weights=torch.rand(num_vars, requires_grad=True, device=device)
self.in_bias=torch.rand(1, requires_grad=True, device=device)
self.scal_mult=torch.rand(1, requires_grad=True, device=device)
self.out_bias=torch.rand(1, requires_grad=True, device=device)
self.parameters=[self.weights,self.in_bias, self.scal_mult, self.out_bias]
def forward(self,mtx):
long_in_bias=self.in_bias.repeat([1,mtx.size()[1]])
long_out_bias=self.out_bias.repeat([1,mtx.size()[1]])
return (self.scal_mult*torch.exp(torch.matmul(self.weights,mtx)+long_in_bias))+long_out_bias
class PolyRegKernel():
parameters= []
weights=None
bias=None
power=None
def __init__(self, num_vars, power):
self.power=power
num_terms=self.num_poly_terms(num_vars, power)
self.weights=torch.rand(num_terms, requires_grad=True, device=device)
self.bias=torch.rand(1, requires_grad=True, device=device)
self.parameters=[self.weights,self.bias]
def num_poly_terms(self,num_vars, power):
if power == 0:
return 0
return int(self.factorial(num_vars+power-1) / self.factorial(power) / self.factorial(num_vars-1)) + self.num_poly_terms(num_vars, power-1)
def factorial(self,n):
if n==0:
return 1
else:
return n*self.factorial(n-1)
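`num_poly_terms` above recursively counts the monomials of degree 1 through `power` in `num_vars` variables, i.e. the sum of C(num_vars + d − 1, d) over d; by the hockey-stick identity this equals C(num_vars + power, power) − 1. A quick cross-check with `math.comb` (standalone sketch, not the module's own code):

```python
from math import comb

def num_poly_terms(num_vars, power):
	# iterative form of the recursive count: monomials of each degree 1..power
	return sum(comb(num_vars + d - 1, d) for d in range(1, power + 1))

# closed form via the hockey-stick identity
assert num_poly_terms(2, 2) == comb(2 + 2, 2) - 1 == 5
assert num_poly_terms(3, 4) == comb(3 + 4, 4) - 1
print(num_poly_terms(2, 2))  # 5
```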
def take_all_pwrs(self, vec, pwr):
#todo: vectorize (kinda)
combins=torch.combinations(vec, r=pwr, with_replacement=True)
out=torch.ones(combins.size()[0]).to(device).to(torch.float)
for i in torch.t(combins).to(device).to(torch.float):
out *= i
if pwr == 1:
return out
else:
return torch.cat((out,self.take_all_pwrs(vec, pwr-1)))
def forward(self,mtx):
#TODO: Vectorize the last part
cols=[]
for i in torch.t(mtx):
cols.append(self.take_all_pwrs(i,self.power))
new_mtx=torch.t(torch.stack(cols))
long_bias=self.bias.repeat([1,mtx.size()[1]])
return torch.matmul(self.weights,new_mtx)+long_bias
def SGDTrain(kernel, data, ground, loss=torch.nn.MSELoss(), iterations=1000, learning_rate=.1, return_losses=False):
optim=torch.optim.SGD(kernel.parameters, lr=learning_rate)
data_cuda=data.to(device)
ground_cuda=ground.to(device)
if (return_losses):
losses=[]
for i in range(iterations):
with torch.set_grad_enabled(True):
optim.zero_grad()
pred=kernel.forward(data_cuda)
ls=loss(pred,ground_cuda)
losses.append(ls.item())
ls.backward()
optim.step()
return [kernel,losses]
else:
for i in range(iterations):
with torch.set_grad_enabled(True):
optim.zero_grad()
pred=kernel.forward(data_cuda)
ls=loss(pred,ground_cuda)
ls.backward()
optim.step()
return kernel
def CustomTrain(kernel, optim, data, ground, loss=torch.nn.MSELoss(), iterations=1000, return_losses=False):
data_cuda=data.to(device)
ground_cuda=ground.to(device)
if (return_losses):
losses=[]
for i in range(iterations):
with torch.set_grad_enabled(True):
optim.zero_grad()
pred=kernel.forward(data_cuda)
ls=loss(pred,ground_cuda)
losses.append(ls.item())
ls.backward()
optim.step()
return [kernel,losses]
else:
for i in range(iterations):
with torch.set_grad_enabled(True):
optim.zero_grad()
pred=kernel.forward(data_cuda)
ls=loss(pred,ground_cuda)
ls.backward()
optim.step()
return kernel
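SGDTrain above runs the standard loop: zero gradients, forward pass, MSE loss, backward, optimizer step. A dependency-free sketch of the same loop for a one-weight linear kernel, with the MSE gradient written out by hand instead of autograd (data and names hypothetical):

```python
def sgd_train(xs, ys, lr=0.05, iterations=500):
	# gradient descent on mean((w*x + b - y)^2) for scalar w, b
	w, b = 0.0, 0.0
	n = len(xs)
	for _ in range(iterations):
		grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
		grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
		w -= lr * grad_w
		b -= lr * grad_b
	return w, b

# recover y = 2x + 1 from four points
w, b = sgd_train([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(round(w, 2), round(b, 2))  # 2.0 1.0
```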

View File

@@ -0,0 +1,122 @@
# Titan Robotics Team 2022: ML Module
# Written by Arthur Lu & Jacob Levine
# Notes:
# this should be imported as a python module using 'import titanlearn'
# this should be included in the local directory or environment variable
# this module is optimized for multithreaded computing
# this module learns from its mistakes far faster than 2022's captains
# setup:
__version__ = "1.1.1"
#changelog should be viewed using print(analysis.__changelog__)
__changelog__ = """changelog:
1.1.1:
- removed matplotlib import
- removed graphloss()
1.1.0:
- added net, dataset, dataloader, and stdtrain template definitions
- added graphloss function
1.0.1:
- added clear functions
1.0.0:
- complete rewrite planned
- deprecated 1.0.0.xxx versions
- added simple training loop
0.0.x:
-added generation of ANNS, basic SGD training
"""
__author__ = (
"Arthur Lu <arthurlu@ttic.edu>,"
"Jacob Levine <jlevine@ttic.edu>,"
)
__all__ = [
'clear',
'net',
'dataset',
'dataloader',
'train',
'stdtrainer',
]
import torch
from os import system, name
import numpy as np
def clear():
if name == 'nt':
_ = system('cls')
else:
_ = system('clear')
class net(torch.nn.Module): #template for standard neural net
def __init__(self):
super(net, self).__init__()
def forward(self, input):
pass
class dataset(torch.utils.data.Dataset): #template for standard dataset
def __init__(self):
super(torch.utils.data.Dataset).__init__()
def __getitem__(self, index):
pass
def __len__(self):
pass
def dataloader(dataset, batch_size, num_workers, shuffle = True):
return torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=shuffle, num_workers=num_workers)
def train(device, net, epochs, trainloader, optimizer, criterion): #expects standard dataloader, whch returns (inputs, labels)
dataset_len = trainloader.dataset.__len__()
iter_count = 0
running_loss = 0
running_loss_list = []
for epoch in range(epochs): # loop over the dataset multiple times
for i, data in enumerate(trainloader, 0):
inputs = data[0].to(device)
labels = data[1].to(device)
optimizer.zero_grad()
outputs = net(inputs)
loss = criterion(outputs, labels.to(torch.float))
loss.backward()
optimizer.step()
# monitoring steps below
iter_count += 1
running_loss += loss.item()
running_loss_list.append(running_loss)
clear()
print("training on: " + device)
print("iteration: " + str(i) + "/" + str(int(dataset_len / trainloader.batch_size)) + " | " + "epoch: " + str(epoch) + "/" + str(epochs))
print("current batch loss: " + str(loss.item))
print("running loss: " + str(running_loss / iter_count))
print("finished training")
return net, running_loss_list
def stdtrainer(net, criterion, optimizer, dataloader, epochs, batch_size):
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
net = net.to(device)
criterion = criterion.to(device)
trainloader = dataloader
return train(device, net, epochs, trainloader, optimizer, criterion)
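The monitor in `train` above prints `running_loss / iter_count`, i.e. the cumulative mean of per-batch losses. A stdlib sketch of that running average (function name hypothetical):

```python
def running_means(batch_losses):
	# cumulative mean after each batch, as the training monitor reports it
	out, total = [], 0.0
	for i, ls in enumerate(batch_losses, start=1):
		total += ls
		out.append(total / i)
	return out

print(running_means([4.0, 2.0, 3.0]))  # [4.0, 3.0, 3.0]
```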

View File

@@ -0,0 +1,58 @@
# Titan Robotics Team 2022: Visualization Module
# Written by Arthur Lu & Jacob Levine
# Notes:
# this should be imported as a python module using 'import visualization'
# this should be included in the local directory or environment variable
# fancy
# setup:
__version__ = "0.0.1"
#changelog should be viewed using print(analysis.__changelog__)
__changelog__ = """changelog:
0.0.1:
- added graphhistogram function as a fragment of visualize_pit.py
0.0.0:
- created visualization.py
- added graphloss()
- added imports
"""
__author__ = (
"Arthur Lu <arthurlu@ttic.edu>,"
"Jacob Levine <jlevine@ttic.edu>,"
)
__all__ = [
'graphloss',
]
import matplotlib.pyplot as plt
import numpy as np
def graphloss(losses):
x = range(0, len(losses))
plt.plot(x, losses)
plt.show()
def graphhistogram(data, figsize, sharey = True): # expects a dictionary with variable names as keys and lists of occurrences as values
fig, ax = plt.subplots(1, len(data), sharey=sharey, figsize=figsize)
i = 0
for variable in data:
ax[i].hist(data[variable])
ax[i].invert_xaxis()
ax[i].set_xlabel('Variable')
ax[i].set_ylabel('Frequency')
ax[i].set_title(variable)
plt.yticks(np.arange(len(data[variable])))
i+=1
plt.show()
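`graphhistogram` expects a dict mapping each variable name to its list of observed values and draws one subplot per key. A stdlib sketch of the per-variable bin counting that the histogram call performs, without any plotting (bin width hypothetical):

```python
from collections import Counter

def bin_counts(values, bin_width=1.0):
	# occurrences per fixed-width bin, keyed by the bin's lower edge
	return dict(sorted(Counter((v // bin_width) * bin_width for v in values).items()))

data = {"score": [1.0, 1.5, 2.2, 2.8, 3.1]}
for variable, values in data.items():
	print(variable, bin_counts(values))  # score {1.0: 2, 2.0: 2, 3.0: 1}
```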